Oct 12 07:35:02 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 12 07:35:02 crc restorecon[4552]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 12 07:35:02 crc restorecon[4552]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc 
restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc 
restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 12 
07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 07:35:02 crc 
restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 07:35:02 crc 
restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02
crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 
07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 07:35:02 crc 
restorecon[4552]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc 
restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 07:35:02 crc restorecon[4552]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 
crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc 
restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:02 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc 
restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc 
restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc 
restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc 
restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 07:35:03 crc restorecon[4552]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 07:35:03 crc restorecon[4552]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 12 07:35:03 crc kubenswrapper[4599]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 12 07:35:03 crc kubenswrapper[4599]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 12 07:35:03 crc kubenswrapper[4599]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 12 07:35:03 crc kubenswrapper[4599]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 12 07:35:03 crc kubenswrapper[4599]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 12 07:35:03 crc kubenswrapper[4599]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.401687 4599 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405295 4599 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405317 4599 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405322 4599 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405326 4599 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405346 4599 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405354 4599 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405360 4599 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405364 4599 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405368 4599 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405371 4599 
feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405375 4599 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405388 4599 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405393 4599 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405397 4599 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405401 4599 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405405 4599 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405409 4599 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405412 4599 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405416 4599 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405419 4599 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405422 4599 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405426 4599 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405429 4599 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405432 4599 feature_gate.go:330] 
unrecognized feature gate: Example Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405436 4599 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405439 4599 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405442 4599 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405445 4599 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405449 4599 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405452 4599 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405455 4599 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405459 4599 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405462 4599 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405466 4599 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405470 4599 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405473 4599 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405477 4599 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405481 4599 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 12 07:35:03 crc 
kubenswrapper[4599]: W1012 07:35:03.405488 4599 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405493 4599 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405499 4599 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405504 4599 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405509 4599 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405513 4599 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405518 4599 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405523 4599 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405528 4599 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405536 4599 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405540 4599 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405544 4599 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405548 4599 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405551 4599 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405555 4599 feature_gate.go:330] 
unrecognized feature gate: CSIDriverSharedResource Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405559 4599 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405562 4599 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405565 4599 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405569 4599 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405573 4599 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405578 4599 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405583 4599 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405587 4599 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405611 4599 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405615 4599 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405619 4599 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405623 4599 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405626 4599 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405630 4599 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405635 4599 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405639 4599 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405642 4599 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.405645 4599 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405753 4599 flags.go:64] FLAG: --address="0.0.0.0" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405762 4599 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405769 4599 flags.go:64] FLAG: --anonymous-auth="true" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405774 4599 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405779 4599 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405783 4599 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405788 4599 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405793 4599 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405796 4599 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405800 4599 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405804 4599 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405808 4599 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 
07:35:03.405812 4599 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405816 4599 flags.go:64] FLAG: --cgroup-root="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405819 4599 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405824 4599 flags.go:64] FLAG: --client-ca-file="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405827 4599 flags.go:64] FLAG: --cloud-config="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405831 4599 flags.go:64] FLAG: --cloud-provider="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405834 4599 flags.go:64] FLAG: --cluster-dns="[]" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405839 4599 flags.go:64] FLAG: --cluster-domain="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405844 4599 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405847 4599 flags.go:64] FLAG: --config-dir="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405851 4599 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405855 4599 flags.go:64] FLAG: --container-log-max-files="5" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405860 4599 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405864 4599 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405872 4599 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405876 4599 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405880 4599 flags.go:64] FLAG: --contention-profiling="false" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405883 
4599 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405887 4599 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405891 4599 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405894 4599 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405899 4599 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405902 4599 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405906 4599 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405910 4599 flags.go:64] FLAG: --enable-load-reader="false" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405913 4599 flags.go:64] FLAG: --enable-server="true" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405917 4599 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405922 4599 flags.go:64] FLAG: --event-burst="100" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405926 4599 flags.go:64] FLAG: --event-qps="50" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405929 4599 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405933 4599 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405937 4599 flags.go:64] FLAG: --eviction-hard="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405941 4599 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405945 4599 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405950 4599 
flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405955 4599 flags.go:64] FLAG: --eviction-soft="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405960 4599 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405964 4599 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405969 4599 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405974 4599 flags.go:64] FLAG: --experimental-mounter-path="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405979 4599 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405984 4599 flags.go:64] FLAG: --fail-swap-on="true" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405989 4599 flags.go:64] FLAG: --feature-gates="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405993 4599 flags.go:64] FLAG: --file-check-frequency="20s" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.405997 4599 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406001 4599 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406005 4599 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406010 4599 flags.go:64] FLAG: --healthz-port="10248" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406014 4599 flags.go:64] FLAG: --help="false" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406017 4599 flags.go:64] FLAG: --hostname-override="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406021 4599 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406025 4599 flags.go:64] 
FLAG: --http-check-frequency="20s" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406029 4599 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406032 4599 flags.go:64] FLAG: --image-credential-provider-config="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406036 4599 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406039 4599 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406043 4599 flags.go:64] FLAG: --image-service-endpoint="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406046 4599 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406050 4599 flags.go:64] FLAG: --kube-api-burst="100" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406054 4599 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406058 4599 flags.go:64] FLAG: --kube-api-qps="50" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406061 4599 flags.go:64] FLAG: --kube-reserved="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406065 4599 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406068 4599 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406072 4599 flags.go:64] FLAG: --kubelet-cgroups="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406076 4599 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406080 4599 flags.go:64] FLAG: --lock-file="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406083 4599 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406087 4599 
flags.go:64] FLAG: --log-flush-frequency="5s" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406091 4599 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406096 4599 flags.go:64] FLAG: --log-json-split-stream="false" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406101 4599 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406105 4599 flags.go:64] FLAG: --log-text-split-stream="false" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406109 4599 flags.go:64] FLAG: --logging-format="text" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406113 4599 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406117 4599 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406121 4599 flags.go:64] FLAG: --manifest-url="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406125 4599 flags.go:64] FLAG: --manifest-url-header="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406130 4599 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406135 4599 flags.go:64] FLAG: --max-open-files="1000000" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406139 4599 flags.go:64] FLAG: --max-pods="110" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406143 4599 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406147 4599 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406151 4599 flags.go:64] FLAG: --memory-manager-policy="None" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406155 4599 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 
07:35:03.406159 4599 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406162 4599 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406166 4599 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406176 4599 flags.go:64] FLAG: --node-status-max-images="50" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406180 4599 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406184 4599 flags.go:64] FLAG: --oom-score-adj="-999" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406188 4599 flags.go:64] FLAG: --pod-cidr="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406191 4599 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406198 4599 flags.go:64] FLAG: --pod-manifest-path="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406202 4599 flags.go:64] FLAG: --pod-max-pids="-1" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406206 4599 flags.go:64] FLAG: --pods-per-core="0" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406210 4599 flags.go:64] FLAG: --port="10250" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406214 4599 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406218 4599 flags.go:64] FLAG: --provider-id="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406222 4599 flags.go:64] FLAG: --qos-reserved="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406226 4599 flags.go:64] FLAG: --read-only-port="10255" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 
07:35:03.406230 4599 flags.go:64] FLAG: --register-node="true" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406233 4599 flags.go:64] FLAG: --register-schedulable="true" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406237 4599 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406245 4599 flags.go:64] FLAG: --registry-burst="10" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406249 4599 flags.go:64] FLAG: --registry-qps="5" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406253 4599 flags.go:64] FLAG: --reserved-cpus="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406256 4599 flags.go:64] FLAG: --reserved-memory="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406261 4599 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406266 4599 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406270 4599 flags.go:64] FLAG: --rotate-certificates="false" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406273 4599 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406279 4599 flags.go:64] FLAG: --runonce="false" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406282 4599 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406286 4599 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406291 4599 flags.go:64] FLAG: --seccomp-default="false" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406295 4599 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406299 4599 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 12 07:35:03 crc kubenswrapper[4599]: 
I1012 07:35:03.406303 4599 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406307 4599 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406314 4599 flags.go:64] FLAG: --storage-driver-password="root" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406317 4599 flags.go:64] FLAG: --storage-driver-secure="false" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406321 4599 flags.go:64] FLAG: --storage-driver-table="stats" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406325 4599 flags.go:64] FLAG: --storage-driver-user="root" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406329 4599 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406346 4599 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406350 4599 flags.go:64] FLAG: --system-cgroups="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406354 4599 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406359 4599 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406364 4599 flags.go:64] FLAG: --tls-cert-file="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406367 4599 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406372 4599 flags.go:64] FLAG: --tls-min-version="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406376 4599 flags.go:64] FLAG: --tls-private-key-file="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406379 4599 flags.go:64] FLAG: --topology-manager-policy="none" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406383 4599 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 12 07:35:03 crc 
kubenswrapper[4599]: I1012 07:35:03.406387 4599 flags.go:64] FLAG: --topology-manager-scope="container" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406393 4599 flags.go:64] FLAG: --v="2" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406398 4599 flags.go:64] FLAG: --version="false" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406403 4599 flags.go:64] FLAG: --vmodule="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406409 4599 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406413 4599 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406503 4599 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406508 4599 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406511 4599 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406515 4599 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406519 4599 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406522 4599 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406525 4599 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406529 4599 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406532 4599 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406535 4599 
feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406538 4599 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406543 4599 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406547 4599 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406550 4599 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406553 4599 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406557 4599 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406560 4599 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406563 4599 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406566 4599 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406570 4599 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406573 4599 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406576 4599 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406580 4599 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406583 4599 feature_gate.go:330] unrecognized feature gate: ManagedBootImages 
Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406586 4599 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406589 4599 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406603 4599 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406608 4599 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406611 4599 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406614 4599 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406617 4599 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406621 4599 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406624 4599 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406628 4599 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406632 4599 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406636 4599 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406639 4599 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406643 4599 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406646 4599 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406650 4599 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406653 4599 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406657 4599 feature_gate.go:330] unrecognized feature gate: Example Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406660 4599 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406664 4599 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406667 4599 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406671 4599 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406674 4599 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406677 4599 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406680 4599 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406683 4599 
feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406688 4599 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406692 4599 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406696 4599 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406700 4599 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406703 4599 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406707 4599 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406711 4599 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406714 4599 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406718 4599 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406723 4599 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406726 4599 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406729 4599 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406732 4599 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406736 4599 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406739 4599 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406742 4599 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406745 4599 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406748 4599 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406752 4599 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406755 4599 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.406758 4599 feature_gate.go:330] 
unrecognized feature gate: ImageStreamImportMode Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.406771 4599 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.414535 4599 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.414615 4599 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414730 4599 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414749 4599 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414755 4599 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414759 4599 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414764 4599 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414769 4599 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414773 4599 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414778 4599 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 12 07:35:03 crc 
kubenswrapper[4599]: W1012 07:35:03.414782 4599 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414786 4599 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414790 4599 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414795 4599 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414800 4599 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414807 4599 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414811 4599 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414815 4599 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414819 4599 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414827 4599 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414831 4599 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414835 4599 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414838 4599 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414842 4599 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414845 4599 feature_gate.go:330] unrecognized 
feature gate: SetEIPForNLBIngressController Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414849 4599 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414852 4599 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414856 4599 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414859 4599 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414862 4599 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414866 4599 feature_gate.go:330] unrecognized feature gate: Example Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414870 4599 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414873 4599 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414876 4599 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414880 4599 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414884 4599 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414887 4599 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414891 4599 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414894 4599 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 
07:35:03.414898 4599 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414901 4599 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414906 4599 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414909 4599 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414914 4599 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414920 4599 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414924 4599 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414927 4599 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414932 4599 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414936 4599 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414940 4599 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414944 4599 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414948 4599 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414952 4599 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 
07:35:03.414957 4599 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414961 4599 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414966 4599 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414969 4599 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414973 4599 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414976 4599 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414980 4599 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414983 4599 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414986 4599 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414990 4599 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414993 4599 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.414997 4599 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415001 4599 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415004 4599 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415009 4599 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415013 4599 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415017 4599 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415020 4599 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415024 4599 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415027 4599 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.415038 4599 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415172 4599 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415181 4599 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes 
Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415185 4599 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415189 4599 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415193 4599 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415196 4599 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415200 4599 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415203 4599 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415207 4599 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415211 4599 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415215 4599 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415219 4599 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415223 4599 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415227 4599 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415231 4599 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415234 4599 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415238 4599 feature_gate.go:330] unrecognized 
feature gate: SigstoreImageVerification Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415241 4599 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415246 4599 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415252 4599 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415255 4599 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415259 4599 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415263 4599 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415267 4599 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415272 4599 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415276 4599 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415280 4599 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415285 4599 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415289 4599 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415293 4599 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415296 4599 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415301 4599 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415305 4599 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415309 4599 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415314 4599 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415318 4599 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415322 4599 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415326 4599 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415329 4599 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415349 4599 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415353 4599 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415357 4599 
feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415361 4599 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415364 4599 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415367 4599 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415371 4599 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415374 4599 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415377 4599 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415381 4599 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415384 4599 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415388 4599 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415391 4599 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415395 4599 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415398 4599 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415401 4599 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415407 4599 feature_gate.go:330] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415410 4599 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415414 4599 feature_gate.go:330] unrecognized feature gate: Example Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415418 4599 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415422 4599 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415425 4599 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415429 4599 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415433 4599 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415437 4599 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415442 4599 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415447 4599 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415451 4599 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415456 4599 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415460 4599 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415463 4599 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.415468 4599 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.415474 4599 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.416432 4599 server.go:940] "Client rotation is on, will bootstrap in background" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.419811 4599 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 
07:35:03.419913 4599 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.420739 4599 server.go:997] "Starting client certificate rotation" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.420770 4599 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.420973 4599 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-09 00:58:32.565205435 +0000 UTC Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.421062 4599 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1385h23m29.144145713s for next certificate rotation Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.434021 4599 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.437021 4599 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.446729 4599 log.go:25] "Validated CRI v1 runtime API" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.466186 4599 log.go:25] "Validated CRI v1 image API" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.467794 4599 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.471376 4599 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-12-07-31-11-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.471417 4599 fs.go:134] Filesystem partitions: 
map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm:{mountpoint:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:50 fsType:tmpfs blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/94b752e0a51c0134b00ddef6dc7a933a9d7c1d9bdc88a18dae4192a0d557d623/merged major:0 minor:43 fsType:overlay blockSize:0}] Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.486838 4599 manager.go:217] Machine: {Timestamp:2025-10-12 07:35:03.485448663 +0000 UTC m=+0.274644165 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2445406 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:c0d90076-d180-408b-98a9-f48996ced0a6 BootID:b3fd0901-686d-4e85-8767-c56bc470edcc Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:65536000 Type:vfs 
Inodes:4108169 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:50 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:4b:a0:11 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:enp3s0 MacAddress:fa:16:3e:4b:a0:11 Speed:-1 Mtu:1500} {Name:enp7s0 MacAddress:fa:16:3e:c3:48:08 Speed:-1 Mtu:1440} {Name:enp7s0.20 MacAddress:52:54:00:5c:92:52 Speed:-1 Mtu:1436} {Name:enp7s0.21 MacAddress:52:54:00:9b:69:77 Speed:-1 Mtu:1436} {Name:enp7s0.22 MacAddress:52:54:00:57:de:14 Speed:-1 Mtu:1436} {Name:eth10 MacAddress:aa:db:6d:99:e2:4f Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fa:aa:67:38:0a:5e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:65536 Type:Data Level:1} {Id:0 Size:65536 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:65536 Type:Data Level:1} {Id:1 Size:65536 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] 
Caches:[{Id:10 Size:65536 Type:Data Level:1} {Id:10 Size:65536 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:65536 Type:Data Level:1} {Id:11 Size:65536 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:65536 Type:Data Level:1} {Id:2 Size:65536 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:65536 Type:Data Level:1} {Id:3 Size:65536 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:65536 Type:Data Level:1} {Id:4 Size:65536 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:65536 Type:Data Level:1} {Id:5 Size:65536 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:65536 Type:Data Level:1} {Id:6 Size:65536 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:65536 Type:Data Level:1} {Id:7 Size:65536 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:65536 Type:Data Level:1} {Id:8 Size:65536 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified 
Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:65536 Type:Data Level:1} {Id:9 Size:65536 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.487032 4599 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.487160 4599 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.488300 4599 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.488481 4599 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.488515 4599 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.488729 4599 topology_manager.go:138] "Creating topology manager with none policy" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.488742 4599 container_manager_linux.go:303] "Creating device plugin manager" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.489103 4599 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.489135 4599 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.489313 4599 state_mem.go:36] "Initialized new in-memory state store" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.489413 4599 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.491167 4599 kubelet.go:418] "Attempting to sync node with API server" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.491189 4599 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.491210 4599 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.491223 4599 kubelet.go:324] "Adding apiserver pod source" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.491235 4599 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.494140 4599 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.494946 4599 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.495189 4599 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.159:6443: connect: connection refused Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.495237 4599 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.159:6443: connect: connection refused Oct 12 07:35:03 crc kubenswrapper[4599]: E1012 07:35:03.495321 4599 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.159:6443: connect: connection refused" logger="UnhandledError" Oct 12 07:35:03 crc kubenswrapper[4599]: E1012 07:35:03.495327 4599 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.159:6443: connect: connection refused" logger="UnhandledError" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.497436 4599 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.498437 4599 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.498464 4599 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 
07:35:03.498471 4599 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.498479 4599 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.498496 4599 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.498503 4599 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.498510 4599 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.498527 4599 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.498535 4599 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.498544 4599 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.498566 4599 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.498575 4599 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.498601 4599 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.499013 4599 server.go:1280] "Started kubelet" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.499330 4599 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.499587 4599 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.499855 4599 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.159:6443: connect: connection refused Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.500268 4599 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 12 07:35:03 crc systemd[1]: Started Kubernetes Kubelet. Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.506139 4599 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.506184 4599 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 12 07:35:03 crc kubenswrapper[4599]: E1012 07:35:03.506131 4599 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 192.168.26.159:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186dae250b56b881 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-12 07:35:03.498987649 +0000 UTC m=+0.288183151,LastTimestamp:2025-10-12 07:35:03.498987649 +0000 UTC m=+0.288183151,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.507872 4599 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 08:10:40.284456996 +0000 UTC Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.507928 4599 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Waiting 912h35m36.776536398s for next certificate rotation Oct 12 07:35:03 crc kubenswrapper[4599]: E1012 07:35:03.507949 4599 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.508077 4599 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.508095 4599 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.508234 4599 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.508740 4599 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.159:6443: connect: connection refused Oct 12 07:35:03 crc kubenswrapper[4599]: E1012 07:35:03.508806 4599 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.159:6443: connect: connection refused" logger="UnhandledError" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.509814 4599 server.go:460] "Adding debug handlers to kubelet server" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.510550 4599 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.510625 4599 factory.go:55] Registering systemd factory Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.510705 
4599 factory.go:221] Registration of the systemd container factory successfully Oct 12 07:35:03 crc kubenswrapper[4599]: E1012 07:35:03.510691 4599 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.159:6443: connect: connection refused" interval="200ms" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.511090 4599 factory.go:153] Registering CRI-O factory Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.511113 4599 factory.go:221] Registration of the crio container factory successfully Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.511137 4599 factory.go:103] Registering Raw factory Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.511151 4599 manager.go:1196] Started watching for new ooms in manager Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.511715 4599 manager.go:319] Starting recovery of all containers Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517280 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517313 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517324 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 12 07:35:03 crc 
kubenswrapper[4599]: I1012 07:35:03.517351 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517361 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517369 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517377 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517400 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517411 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517421 4599 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517428 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517438 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517447 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517456 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517465 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517476 4599 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517484 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517493 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517506 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517514 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517522 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517531 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517539 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517548 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517556 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517564 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517573 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517584 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517596 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517604 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517613 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517622 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517631 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517650 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517668 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517677 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517686 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517719 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517729 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517737 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517746 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517754 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517763 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517772 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517780 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517788 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" 
seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517804 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517812 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517820 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517831 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517840 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517848 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517860 4599 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517871 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517882 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517891 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517909 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517918 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517927 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517936 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517947 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517955 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517963 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517971 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517979 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517986 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.517994 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518004 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518014 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518022 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518030 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" 
seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518038 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518052 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518060 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518068 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518076 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518085 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 
07:35:03.518103 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518112 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518124 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518132 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518141 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518159 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518167 4599 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518175 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518185 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518194 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518207 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518215 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518224 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518232 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518243 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518255 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518264 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518273 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518280 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518289 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518297 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518305 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518322 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518355 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518366 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" 
seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518374 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518382 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518394 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518404 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518414 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518424 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 
07:35:03.518434 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518449 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518457 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518482 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518491 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518499 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518512 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518519 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518531 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518540 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518549 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518557 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518567 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518576 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518584 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518592 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518603 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518612 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518624 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518632 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518655 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518664 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518672 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518681 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518688 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 12 07:35:03 
crc kubenswrapper[4599]: I1012 07:35:03.518696 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518703 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518711 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518720 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518728 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518736 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518744 4599 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518752 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518760 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518773 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518785 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518793 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518801 4599 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518809 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518821 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518829 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518837 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518845 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518853 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518861 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518875 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518884 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518894 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518904 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518912 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518920 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518929 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518938 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518946 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518955 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518963 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" 
seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518971 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518980 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518988 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.518996 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519004 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519017 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519026 4599 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519035 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519044 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519061 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519070 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519077 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519086 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519099 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519107 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519124 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519133 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519142 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519151 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519159 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519172 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519180 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519187 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519198 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519209 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519220 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519228 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519236 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519244 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519251 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519263 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519272 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519281 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519289 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519298 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519305 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519312 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519361 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519372 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519382 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519394 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519403 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.519415 4599 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.520816 4599 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.520843 4599 reconstruct.go:97] "Volume reconstruction finished" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.520850 4599 reconciler.go:26] "Reconciler: start to sync state" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.530202 4599 manager.go:324] Recovery completed Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.538930 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.540004 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.540052 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.540063 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.540827 4599 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.540865 4599 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.540875 4599 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.540892 4599 state_mem.go:36] "Initialized new in-memory state store" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.543951 4599 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.543999 4599 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.544028 4599 kubelet.go:2335] "Starting kubelet main sync loop" Oct 12 07:35:03 crc kubenswrapper[4599]: E1012 07:35:03.544064 4599 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.544872 4599 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.159:6443: connect: connection refused Oct 12 07:35:03 crc kubenswrapper[4599]: E1012 07:35:03.544947 4599 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.159:6443: connect: connection refused" logger="UnhandledError" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.545473 4599 policy_none.go:49] "None policy: Start" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.546084 4599 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 12 
07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.546105 4599 state_mem.go:35] "Initializing new in-memory state store" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.594300 4599 manager.go:334] "Starting Device Plugin manager" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.594364 4599 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.594378 4599 server.go:79] "Starting device plugin registration server" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.595012 4599 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.595031 4599 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.595316 4599 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.595477 4599 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.595492 4599 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 12 07:35:03 crc kubenswrapper[4599]: E1012 07:35:03.602046 4599 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.644926 4599 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.645143 4599 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.646159 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.646243 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.646260 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.646583 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.646891 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.646929 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.647610 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.647624 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.647638 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.647653 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.647642 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 
07:35:03.647782 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.647783 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.647964 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.648001 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.648329 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.648370 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.648379 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.648476 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.648614 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.648656 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.649233 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.649254 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.649263 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.649347 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.649364 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.649370 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.649388 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.649374 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.649431 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.649455 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.649406 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.649565 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.649887 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.649917 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.649934 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.650082 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.650107 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.650254 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.650281 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.650298 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.650810 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.650836 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.650847 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.695553 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.696352 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.696449 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.696517 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:03 crc 
kubenswrapper[4599]: I1012 07:35:03.696589 4599 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 12 07:35:03 crc kubenswrapper[4599]: E1012 07:35:03.697167 4599 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.159:6443: connect: connection refused" node="crc" Oct 12 07:35:03 crc kubenswrapper[4599]: E1012 07:35:03.711693 4599 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.159:6443: connect: connection refused" interval="400ms" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.723839 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.723875 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.723897 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.723912 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.723954 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.724003 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.724022 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.724095 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.724128 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.724152 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.724193 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.724210 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.724253 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.724284 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.724327 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825256 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825314 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825365 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825384 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825403 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825425 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825445 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825461 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825477 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825497 4599 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825513 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825512 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825568 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825568 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825614 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825618 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825625 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825528 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825656 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825686 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825657 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825711 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825581 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825788 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825791 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825820 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.825909 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.826006 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.826017 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.826113 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.898101 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.899953 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.900002 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.900015 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.900046 4599 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 12 07:35:03 crc kubenswrapper[4599]: E1012 07:35:03.900614 4599 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.159:6443: connect: connection refused" node="crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.964030 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.968858 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.981174 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.987223 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: I1012 07:35:03.991351 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.993353 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-6b82daf8369eb2ca381d96af02c0e2100707e8a4c4cdc10fdf446db54aec4cf4 WatchSource:0}: Error finding container 6b82daf8369eb2ca381d96af02c0e2100707e8a4c4cdc10fdf446db54aec4cf4: Status 404 returned error can't find the container with id 6b82daf8369eb2ca381d96af02c0e2100707e8a4c4cdc10fdf446db54aec4cf4 Oct 12 07:35:03 crc kubenswrapper[4599]: W1012 07:35:03.994098 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-d018a6b0c484c1282b8327254b41d2ad331740cdc33de2e055907758bdeed1bb WatchSource:0}: Error finding container d018a6b0c484c1282b8327254b41d2ad331740cdc33de2e055907758bdeed1bb: Status 404 returned error can't find the container with id d018a6b0c484c1282b8327254b41d2ad331740cdc33de2e055907758bdeed1bb Oct 12 07:35:04 crc kubenswrapper[4599]: W1012 07:35:04.000298 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-00aae0f5ebdd6840749530f4a430b70cdafe2bcf2d5271771fb477390b2e71af WatchSource:0}: Error finding container 00aae0f5ebdd6840749530f4a430b70cdafe2bcf2d5271771fb477390b2e71af: Status 404 returned error can't find the container with id 00aae0f5ebdd6840749530f4a430b70cdafe2bcf2d5271771fb477390b2e71af Oct 12 07:35:04 crc kubenswrapper[4599]: W1012 07:35:04.004696 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-654c9df3e999d4e8a32438a18dc313ed4958ed40a65238a0899f0db1b1e0d699 
WatchSource:0}: Error finding container 654c9df3e999d4e8a32438a18dc313ed4958ed40a65238a0899f0db1b1e0d699: Status 404 returned error can't find the container with id 654c9df3e999d4e8a32438a18dc313ed4958ed40a65238a0899f0db1b1e0d699 Oct 12 07:35:04 crc kubenswrapper[4599]: W1012 07:35:04.005483 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-85fd7a6ccd70b06138d9b0fb01f61c0b091d50c66867ccdffd1067a8bad7a34c WatchSource:0}: Error finding container 85fd7a6ccd70b06138d9b0fb01f61c0b091d50c66867ccdffd1067a8bad7a34c: Status 404 returned error can't find the container with id 85fd7a6ccd70b06138d9b0fb01f61c0b091d50c66867ccdffd1067a8bad7a34c Oct 12 07:35:04 crc kubenswrapper[4599]: E1012 07:35:04.112446 4599 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.159:6443: connect: connection refused" interval="800ms" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.301134 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.302615 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.302683 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.302700 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.302739 4599 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 12 07:35:04 crc kubenswrapper[4599]: E1012 07:35:04.303350 4599 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.159:6443: connect: connection refused" node="crc" Oct 12 07:35:04 crc kubenswrapper[4599]: W1012 07:35:04.376476 4599 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.159:6443: connect: connection refused Oct 12 07:35:04 crc kubenswrapper[4599]: E1012 07:35:04.376558 4599 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.159:6443: connect: connection refused" logger="UnhandledError" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.500585 4599 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.159:6443: connect: connection refused Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.550491 4599 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9" exitCode=0 Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.550589 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9"} Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.550714 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"85fd7a6ccd70b06138d9b0fb01f61c0b091d50c66867ccdffd1067a8bad7a34c"} Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.550865 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.552087 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.552117 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.552128 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.552149 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e"} Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.552173 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"654c9df3e999d4e8a32438a18dc313ed4958ed40a65238a0899f0db1b1e0d699"} Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.553547 4599 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed" exitCode=0 Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.553659 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed"} Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.553789 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"00aae0f5ebdd6840749530f4a430b70cdafe2bcf2d5271771fb477390b2e71af"} Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.553935 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.554923 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.554984 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.555007 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.555546 4599 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6d98167ce5b8c2681e1606ed5f3b3c1c5172ef2c6aac08bd0db1cacd560d0581" exitCode=0 Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.555606 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6d98167ce5b8c2681e1606ed5f3b3c1c5172ef2c6aac08bd0db1cacd560d0581"} Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.555630 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d018a6b0c484c1282b8327254b41d2ad331740cdc33de2e055907758bdeed1bb"} Oct 12 07:35:04 
crc kubenswrapper[4599]: I1012 07:35:04.555725 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.556362 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.556400 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.556409 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.556821 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.557109 4599 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="b748451e24963062c3aec5bf944775ca77c166a8b97b4309b9102a9157312cdd" exitCode=0 Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.557140 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"b748451e24963062c3aec5bf944775ca77c166a8b97b4309b9102a9157312cdd"} Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.557588 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"6b82daf8369eb2ca381d96af02c0e2100707e8a4c4cdc10fdf446db54aec4cf4"} Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.557651 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.559022 4599 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.559087 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.559111 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.561509 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.561538 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:04 crc kubenswrapper[4599]: I1012 07:35:04.561550 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:04 crc kubenswrapper[4599]: W1012 07:35:04.616023 4599 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.159:6443: connect: connection refused Oct 12 07:35:04 crc kubenswrapper[4599]: E1012 07:35:04.616104 4599 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.159:6443: connect: connection refused" logger="UnhandledError" Oct 12 07:35:04 crc kubenswrapper[4599]: E1012 07:35:04.913638 4599 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.159:6443: connect: connection refused" interval="1.6s" 
Oct 12 07:35:04 crc kubenswrapper[4599]: W1012 07:35:04.994158 4599 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.159:6443: connect: connection refused Oct 12 07:35:04 crc kubenswrapper[4599]: E1012 07:35:04.994318 4599 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.159:6443: connect: connection refused" logger="UnhandledError" Oct 12 07:35:05 crc kubenswrapper[4599]: W1012 07:35:05.072046 4599 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.159:6443: connect: connection refused Oct 12 07:35:05 crc kubenswrapper[4599]: E1012 07:35:05.072115 4599 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.159:6443: connect: connection refused" logger="UnhandledError" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.103540 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.104532 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.104585 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:05 
crc kubenswrapper[4599]: I1012 07:35:05.104596 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.104640 4599 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 12 07:35:05 crc kubenswrapper[4599]: E1012 07:35:05.105002 4599 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.159:6443: connect: connection refused" node="crc" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.562390 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4"} Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.562507 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.562513 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528"} Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.562641 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d"} Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.563444 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.563491 4599 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.563506 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.566031 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4"} Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.566096 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628"} Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.566108 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267"} Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.566118 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb"} Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.566127 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3"} Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.566231 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 
07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.567004 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.567085 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.567098 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.568464 4599 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f23d6ac5f27394ffff846b0fa1b29ccaee19d85b3f638655c3b024ba5f5fb206" exitCode=0 Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.568534 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f23d6ac5f27394ffff846b0fa1b29ccaee19d85b3f638655c3b024ba5f5fb206"} Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.568646 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.569367 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.569393 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.569406 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.570821 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"443dfe629ce4bc2f6d96108ef419b3f8e577609981571c42882a8b3eb587722a"} Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.570907 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.572206 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.572242 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.572258 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.573909 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"091875c6e2e9f643a0c31783ab1905c5fd448d20e98b9b4de70be991e7e8f1c6"} Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.573938 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"56aa5b66fc1ca787e84c8ca2a717038f9e522862fb680d5271921e410d5841f4"} Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.573952 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f52b4e0ae3f90600ec5e799352a1f70b4fad54a4805616d1508c22af008a0f8b"} Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.574043 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:05 crc 
kubenswrapper[4599]: I1012 07:35:05.577774 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.577808 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:05 crc kubenswrapper[4599]: I1012 07:35:05.577821 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.578065 4599 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e87f0b014f9e37cdcbf6017f16972e7d97b8668e7577ef7c6fc5625b3c0a6c50" exitCode=0 Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.578148 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e87f0b014f9e37cdcbf6017f16972e7d97b8668e7577ef7c6fc5625b3c0a6c50"} Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.578168 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.578280 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.578244 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.579521 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.579561 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.579569 4599 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.579653 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.579689 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.579700 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.580410 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.580474 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.580488 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.705193 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.706409 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.706444 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.706455 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.706482 4599 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.966093 4599 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.966398 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.966439 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.970750 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.970782 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:06 crc kubenswrapper[4599]: I1012 07:35:06.970799 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:07 crc kubenswrapper[4599]: I1012 07:35:07.584458 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"097ef1d7c37423f694e67acfe553b043d51edd1aa552f754a9c5bf96ae67bb6e"} Oct 12 07:35:07 crc kubenswrapper[4599]: I1012 07:35:07.584499 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"aed1415303d72c92ce821232d50aad3924d40bcaa955f30e143e07566b77cb80"} Oct 12 07:35:07 crc kubenswrapper[4599]: I1012 07:35:07.584513 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d21de833749a18822f5d386c0838dbdae6b06b1cef92fb274fd9b3cfca64514a"} Oct 12 07:35:07 crc kubenswrapper[4599]: I1012 07:35:07.584523 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d791b457dfa4f7399f4afd3f0a05030a0b2484f2dbd9af24babea12b3a05b43c"} Oct 12 07:35:07 crc kubenswrapper[4599]: I1012 07:35:07.584531 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8e880b7ae0a97113db0f9eb218d23e7fe3592aa2d77f1e798cb0d6cd04ed0ce5"} Oct 12 07:35:07 crc kubenswrapper[4599]: I1012 07:35:07.584530 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:07 crc kubenswrapper[4599]: I1012 07:35:07.584594 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:07 crc kubenswrapper[4599]: I1012 07:35:07.585291 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:07 crc kubenswrapper[4599]: I1012 07:35:07.585310 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:07 crc kubenswrapper[4599]: I1012 07:35:07.585314 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:07 crc kubenswrapper[4599]: I1012 07:35:07.585356 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:07 crc kubenswrapper[4599]: I1012 07:35:07.585367 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:07 crc kubenswrapper[4599]: I1012 07:35:07.585359 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:08 crc kubenswrapper[4599]: I1012 07:35:08.725558 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 12 07:35:08 crc 
kubenswrapper[4599]: I1012 07:35:08.725715 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:08 crc kubenswrapper[4599]: I1012 07:35:08.726483 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:08 crc kubenswrapper[4599]: I1012 07:35:08.726510 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:08 crc kubenswrapper[4599]: I1012 07:35:08.726518 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:08 crc kubenswrapper[4599]: I1012 07:35:08.768830 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 07:35:08 crc kubenswrapper[4599]: I1012 07:35:08.768937 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:08 crc kubenswrapper[4599]: I1012 07:35:08.769513 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:08 crc kubenswrapper[4599]: I1012 07:35:08.769532 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:08 crc kubenswrapper[4599]: I1012 07:35:08.769540 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:10 crc kubenswrapper[4599]: I1012 07:35:10.092978 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 12 07:35:10 crc kubenswrapper[4599]: I1012 07:35:10.093770 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:10 crc kubenswrapper[4599]: I1012 07:35:10.095553 4599 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:10 crc kubenswrapper[4599]: I1012 07:35:10.095619 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:10 crc kubenswrapper[4599]: I1012 07:35:10.095630 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:10 crc kubenswrapper[4599]: I1012 07:35:10.477793 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 07:35:10 crc kubenswrapper[4599]: I1012 07:35:10.478012 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:10 crc kubenswrapper[4599]: I1012 07:35:10.479116 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:10 crc kubenswrapper[4599]: I1012 07:35:10.479153 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:10 crc kubenswrapper[4599]: I1012 07:35:10.479163 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:10 crc kubenswrapper[4599]: I1012 07:35:10.563597 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 07:35:10 crc kubenswrapper[4599]: I1012 07:35:10.563782 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:10 crc kubenswrapper[4599]: I1012 07:35:10.564734 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:10 crc kubenswrapper[4599]: I1012 07:35:10.564764 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:10 crc 
kubenswrapper[4599]: I1012 07:35:10.564773 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:11 crc kubenswrapper[4599]: I1012 07:35:11.599940 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 07:35:11 crc kubenswrapper[4599]: I1012 07:35:11.600158 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:11 crc kubenswrapper[4599]: I1012 07:35:11.601146 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:11 crc kubenswrapper[4599]: I1012 07:35:11.601197 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:11 crc kubenswrapper[4599]: I1012 07:35:11.601207 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:11 crc kubenswrapper[4599]: I1012 07:35:11.769617 4599 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 12 07:35:11 crc kubenswrapper[4599]: I1012 07:35:11.769691 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 07:35:12 crc kubenswrapper[4599]: I1012 07:35:12.727731 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 07:35:12 crc kubenswrapper[4599]: I1012 07:35:12.727972 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:12 crc kubenswrapper[4599]: I1012 07:35:12.729076 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:12 crc kubenswrapper[4599]: I1012 07:35:12.729119 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:12 crc kubenswrapper[4599]: I1012 07:35:12.729128 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:13 crc kubenswrapper[4599]: E1012 07:35:13.602168 4599 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 12 07:35:14 crc kubenswrapper[4599]: I1012 07:35:14.558710 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 07:35:14 crc kubenswrapper[4599]: I1012 07:35:14.558917 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:14 crc kubenswrapper[4599]: I1012 07:35:14.560033 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:14 crc kubenswrapper[4599]: I1012 07:35:14.560110 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:14 crc kubenswrapper[4599]: I1012 07:35:14.560123 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:14 crc kubenswrapper[4599]: I1012 07:35:14.564429 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 07:35:14 crc kubenswrapper[4599]: I1012 07:35:14.597652 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:14 crc kubenswrapper[4599]: I1012 07:35:14.598675 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:14 crc kubenswrapper[4599]: I1012 07:35:14.598723 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:14 crc kubenswrapper[4599]: I1012 07:35:14.598735 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:14 crc kubenswrapper[4599]: I1012 07:35:14.601356 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 07:35:15 crc kubenswrapper[4599]: I1012 07:35:15.501487 4599 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 12 07:35:15 crc kubenswrapper[4599]: I1012 07:35:15.599508 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:15 crc kubenswrapper[4599]: I1012 07:35:15.600397 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:15 crc kubenswrapper[4599]: I1012 07:35:15.600436 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:15 crc kubenswrapper[4599]: I1012 07:35:15.600445 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:15 crc kubenswrapper[4599]: I1012 07:35:15.740997 4599 
patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 12 07:35:15 crc kubenswrapper[4599]: I1012 07:35:15.741085 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 12 07:35:15 crc kubenswrapper[4599]: I1012 07:35:15.748717 4599 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 12 07:35:15 crc kubenswrapper[4599]: I1012 07:35:15.748807 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 12 07:35:18 crc kubenswrapper[4599]: I1012 07:35:18.747647 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 12 07:35:18 crc kubenswrapper[4599]: I1012 07:35:18.747814 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:18 crc kubenswrapper[4599]: I1012 07:35:18.748767 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 
07:35:18 crc kubenswrapper[4599]: I1012 07:35:18.748801 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:18 crc kubenswrapper[4599]: I1012 07:35:18.748813 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:18 crc kubenswrapper[4599]: I1012 07:35:18.759637 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 12 07:35:19 crc kubenswrapper[4599]: I1012 07:35:19.610222 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:19 crc kubenswrapper[4599]: I1012 07:35:19.611348 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:19 crc kubenswrapper[4599]: I1012 07:35:19.611406 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:19 crc kubenswrapper[4599]: I1012 07:35:19.611419 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.480553 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.480775 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.481268 4599 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.481317 4599 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.482206 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.482235 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.482244 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.483704 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.611054 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.611463 4599 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.611539 4599 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.612064 4599 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.612099 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.612109 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:20 crc kubenswrapper[4599]: E1012 07:35:20.750324 4599 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.752234 4599 trace.go:236] Trace[788883075]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Oct-2025 07:35:06.645) (total time: 14107ms): Oct 12 07:35:20 crc kubenswrapper[4599]: Trace[788883075]: ---"Objects listed" error: 14107ms (07:35:20.752) Oct 12 07:35:20 crc kubenswrapper[4599]: Trace[788883075]: [14.10712098s] [14.10712098s] END Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.752266 4599 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.752652 4599 trace.go:236] Trace[1168319381]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Oct-2025 07:35:07.194) (total time: 13557ms): Oct 12 07:35:20 crc kubenswrapper[4599]: Trace[1168319381]: ---"Objects listed" error: 13557ms (07:35:20.752) Oct 12 07:35:20 crc kubenswrapper[4599]: Trace[1168319381]: [13.557975186s] [13.557975186s] END Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.752678 4599 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.752986 
4599 trace.go:236] Trace[1595170751]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Oct-2025 07:35:06.780) (total time: 13972ms): Oct 12 07:35:20 crc kubenswrapper[4599]: Trace[1595170751]: ---"Objects listed" error: 13972ms (07:35:20.752) Oct 12 07:35:20 crc kubenswrapper[4599]: Trace[1595170751]: [13.972210804s] [13.972210804s] END Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.753002 4599 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 12 07:35:20 crc kubenswrapper[4599]: E1012 07:35:20.753747 4599 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.753972 4599 trace.go:236] Trace[1083209017]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Oct-2025 07:35:08.267) (total time: 12486ms): Oct 12 07:35:20 crc kubenswrapper[4599]: Trace[1083209017]: ---"Objects listed" error: 12486ms (07:35:20.753) Oct 12 07:35:20 crc kubenswrapper[4599]: Trace[1083209017]: [12.48666205s] [12.48666205s] END Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.753991 4599 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 12 07:35:20 crc kubenswrapper[4599]: I1012 07:35:20.754093 4599 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.191963 4599 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.192030 4599 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.445707 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.448998 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.502162 4599 apiserver.go:52] "Watching apiserver" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.503956 4599 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.504378 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-f5988","openshift-image-registry/node-ca-rxr42","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-operator/iptables-alerter-4ln5h"] Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.504698 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.504768 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.504873 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.504934 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.504867 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.505257 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.505405 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.505448 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.505412 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.505813 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rxr42" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.505888 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-f5988" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.505997 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.506191 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.507054 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.508013 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.508279 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.508737 4599 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.509587 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 12 07:35:21 crc 
kubenswrapper[4599]: I1012 07:35:21.509600 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.509634 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.509715 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.509596 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.509795 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.509834 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.510079 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.510136 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.510164 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.510183 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.521040 4599 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.529950 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.538578 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.554877 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.557826 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.557868 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.557889 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.557912 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.557930 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.557946 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.557964 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.557978 4599 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.557997 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558015 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558030 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558048 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558119 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558137 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558152 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558170 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558187 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558206 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558222 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558240 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558255 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558270 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558287 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558305 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558321 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558357 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558374 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558391 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558407 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558423 4599 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558438 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558469 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558469 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558483 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-5mz5c"] Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558486 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558550 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558575 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558595 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558621 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 
07:35:21.558640 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558660 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558676 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558694 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558701 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558749 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558786 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558810 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558829 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558836 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558849 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558867 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558884 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558905 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558923 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558939 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558957 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558975 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558990 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559012 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559028 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559049 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559081 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559100 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559119 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559134 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559149 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559165 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559183 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559203 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559221 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559236 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559254 4599 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559273 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559290 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559307 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559326 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559376 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559397 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559416 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559432 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559451 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559467 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559485 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559500 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559517 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559535 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559553 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559573 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 12 07:35:21 
crc kubenswrapper[4599]: I1012 07:35:21.559588 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559604 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559621 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559637 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559654 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559673 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559690 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559711 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559727 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559744 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559762 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 
12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559780 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559796 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559813 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559828 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559846 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559863 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559883 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559902 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559918 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559936 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559953 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" 
(UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559969 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559987 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560005 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560022 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560039 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560058 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" 
(UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560087 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560105 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560125 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560143 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560159 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:35:21 crc 
kubenswrapper[4599]: I1012 07:35:21.560177 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560193 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560210 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560227 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560243 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560259 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560276 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560293 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560311 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560328 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560362 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560378 4599 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560395 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560411 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560429 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560447 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560464 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560480 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560499 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560531 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560550 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560567 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 
07:35:21.560591 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560617 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560634 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560650 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560668 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560685 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560702 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560719 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560735 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560751 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560770 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560787 4599 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560804 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560824 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560843 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560861 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560876 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560893 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560910 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560929 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560948 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560966 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560988 4599 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561014 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561031 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561049 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561076 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561095 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" 
(UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561132 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561152 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561170 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561188 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561206 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561224 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561244 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561260 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561279 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561299 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561320 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") 
pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561355 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561374 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561393 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561413 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561436 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561456 4599 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561478 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561496 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561513 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561532 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561549 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561565 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561583 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561599 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561615 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561633 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 12 07:35:21 crc kubenswrapper[4599]: 
I1012 07:35:21.561652 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561670 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561711 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561736 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561758 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561778 4599 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561800 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561821 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561860 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzz9z\" (UniqueName: \"kubernetes.io/projected/0147ed28-bc0f-409c-a813-dc2ffffba092-kube-api-access-wzz9z\") pod \"node-resolver-f5988\" (UID: \"0147ed28-bc0f-409c-a813-dc2ffffba092\") " pod="openshift-dns/node-resolver-f5988" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561880 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" 
Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561901 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0147ed28-bc0f-409c-a813-dc2ffffba092-hosts-file\") pod \"node-resolver-f5988\" (UID: \"0147ed28-bc0f-409c-a813-dc2ffffba092\") " pod="openshift-dns/node-resolver-f5988" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561919 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561938 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561956 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561975 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" 
(UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561992 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.562011 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.562031 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7hr5\" (UniqueName: \"kubernetes.io/projected/92909e59-b659-4fc0-91c0-1880ff96b4f3-kube-api-access-h7hr5\") pod \"node-ca-rxr42\" (UID: \"92909e59-b659-4fc0-91c0-1880ff96b4f3\") " pod="openshift-image-registry/node-ca-rxr42" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.562081 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.562103 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/92909e59-b659-4fc0-91c0-1880ff96b4f3-host\") pod \"node-ca-rxr42\" (UID: \"92909e59-b659-4fc0-91c0-1880ff96b4f3\") " pod="openshift-image-registry/node-ca-rxr42" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.562121 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/92909e59-b659-4fc0-91c0-1880ff96b4f3-serviceca\") pod \"node-ca-rxr42\" (UID: \"92909e59-b659-4fc0-91c0-1880ff96b4f3\") " pod="openshift-image-registry/node-ca-rxr42" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.562168 4599 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.562179 4599 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.562189 4599 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.558765 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-8hm26"] Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.562731 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-9xbn5"] Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.563147 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-whk5b"] Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.563773 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.564040 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.565556 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.572769 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559046 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559202 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559350 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559416 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559515 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559573 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559811 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559862 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.559941 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560012 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560082 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560115 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560166 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560258 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560268 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560468 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560625 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560684 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560775 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560819 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.560914 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561139 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561160 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561198 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561270 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561319 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561412 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561445 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561451 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561499 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561629 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561698 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561791 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.561869 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.562007 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.562026 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.562211 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.562299 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.562426 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:35:22.062406167 +0000 UTC m=+18.851601669 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.562429 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.562918 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.575824 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.563903 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.564309 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.564480 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.564507 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.564505 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.564517 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.564581 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.564616 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.564753 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.564769 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.565002 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.565132 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.565226 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.565354 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.565375 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.565482 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.565580 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.565669 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.565742 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.565769 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.565847 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.566040 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.566428 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.566457 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.566647 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: W1012 07:35:21.566814 4599 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": failed to list *v1.ConfigMap: configmaps "ovnkube-script-lib" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 12 07:35:21 crc kubenswrapper[4599]: W1012 07:35:21.566845 4599 reflector.go:561] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 12 07:35:21 crc kubenswrapper[4599]: W1012 07:35:21.566868 4599 reflector.go:561] object-"openshift-ovn-kubernetes"/"env-overrides": failed to list *v1.ConfigMap: configmaps "env-overrides" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API 
group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.566922 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: W1012 07:35:21.567092 4599 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovnkube-config": failed to list *v1.ConfigMap: configmaps "ovnkube-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 12 07:35:21 crc kubenswrapper[4599]: W1012 07:35:21.567134 4599 reflector.go:561] object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.567380 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.567702 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.567906 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.567975 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.568087 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.568157 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.568190 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.568204 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.568247 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.568416 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.568461 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.568678 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.568998 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.569024 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.569197 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.569394 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.569403 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.569409 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.569952 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.570003 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.572202 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.572398 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.572430 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.572470 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.572538 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.572653 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.572656 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.572707 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.572724 4599 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.572730 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.572883 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.572955 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.573064 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.573237 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.573248 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.573439 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.573700 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.574159 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.574230 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.574357 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.575459 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.575570 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.575585 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.575851 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.575890 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.575903 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.576042 4599 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.576188 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.576218 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:22.076202065 +0000 UTC m=+18.865397567 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.576199 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.576365 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.576401 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.576537 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.576585 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.576645 4599 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.576804 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.576620 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.576899 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.576904 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.577286 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.577666 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.577732 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.577854 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.577955 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.578088 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.578208 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.578232 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.578317 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.578524 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.578569 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.578579 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.578723 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.578798 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.578911 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.579024 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.579040 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.579042 4599 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnkube-script-lib\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.579057 4599 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.579062 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.579102 4599 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"env-overrides\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"env-overrides\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.579121 4599 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnkube-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.579141 4599 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.579324 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.579411 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.579577 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.579619 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.579723 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.579766 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:22.079752351 +0000 UTC m=+18.868947853 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.576041 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.579886 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.580244 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). 
InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.581970 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.581986 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.582815 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.582871 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.583681 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.583874 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.584062 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.584086 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.584582 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.584770 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.585101 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.585385 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.585719 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.585898 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.585927 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.586134 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.586153 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.586173 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.586353 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.586386 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.586633 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.586645 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.586905 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.586932 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.585133 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.587198 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.587244 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.587544 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.587554 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.587560 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.587602 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.587836 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.588671 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.588766 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.588875 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.588898 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.589055 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.589181 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.589510 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.589793 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.589891 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.589960 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.590325 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.590646 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.590829 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.591095 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.591142 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.591439 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.591706 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.591872 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.591923 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.592147 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.592232 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.592267 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.593535 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.594473 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.594493 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.594547 4599 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.594611 4599 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:22.094595387 +0000 UTC m=+18.883790889 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.595978 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.596004 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.596018 4599 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.596077 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:22.096054754 +0000 UTC m=+18.885250247 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.596116 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.596536 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.596735 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.597226 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.598117 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.603425 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.607001 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.615064 4599 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.615159 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.616254 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.616313 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.617037 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.618462 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.621307 4599 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4" exitCode=255 Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.621417 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4"} Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.621976 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.624955 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:35:21 crc kubenswrapper[4599]: E1012 07:35:21.625755 4599 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.627667 4599 scope.go:117] "RemoveContainer" containerID="21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.627707 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.628188 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.635254 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.640992 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.650511 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded 
a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.658773 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663494 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7hr5\" (UniqueName: \"kubernetes.io/projected/92909e59-b659-4fc0-91c0-1880ff96b4f3-kube-api-access-h7hr5\") pod \"node-ca-rxr42\" (UID: \"92909e59-b659-4fc0-91c0-1880ff96b4f3\") " pod="openshift-image-registry/node-ca-rxr42" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663526 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-run-ovn\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663545 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovnkube-script-lib\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663563 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-system-cni-dir\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663579 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18ca8765-c435-4750-b803-14b539958d9e-cnibin\") pod \"multus-additional-cni-plugins-9xbn5\" (UID: \"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663596 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25l5n\" (UniqueName: \"kubernetes.io/projected/cc694bce-8c25-4729-b452-29d44d3efe6e-kube-api-access-25l5n\") pod \"machine-config-daemon-5mz5c\" (UID: \"cc694bce-8c25-4729-b452-29d44d3efe6e\") " pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663611 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-host-var-lib-cni-bin\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663625 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-etc-openvswitch\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663640 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-run-openvswitch\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663655 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-run-ovn-kubernetes\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663669 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ce311f52-0501-45d3-8209-b1d2aa25028b-cni-binary-copy\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663682 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-host-var-lib-cni-multus\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663696 4599 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18ca8765-c435-4750-b803-14b539958d9e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9xbn5\" (UID: \"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663708 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-kubelet\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663734 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-host-var-lib-kubelet\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663750 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st7tw\" (UniqueName: \"kubernetes.io/projected/18ca8765-c435-4750-b803-14b539958d9e-kube-api-access-st7tw\") pod \"multus-additional-cni-plugins-9xbn5\" (UID: \"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663767 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-var-lib-openvswitch\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663781 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovnkube-config\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663795 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-env-overrides\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663812 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cc694bce-8c25-4729-b452-29d44d3efe6e-proxy-tls\") pod \"machine-config-daemon-5mz5c\" (UID: \"cc694bce-8c25-4729-b452-29d44d3efe6e\") " pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663834 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-hostroot\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663849 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-etc-kubernetes\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " 
pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663863 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-run-netns\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663880 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovn-node-metrics-cert\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663897 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663942 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ce311f52-0501-45d3-8209-b1d2aa25028b-multus-daemon-config\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663959 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/18ca8765-c435-4750-b803-14b539958d9e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9xbn5\" (UID: 
\"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.663975 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-systemd-units\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664002 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-os-release\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664016 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-host-run-k8s-cni-cncf-io\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664030 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-node-log\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664049 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-cnibin\") pod \"multus-8hm26\" (UID: 
\"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664062 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/92909e59-b659-4fc0-91c0-1880ff96b4f3-host\") pod \"node-ca-rxr42\" (UID: \"92909e59-b659-4fc0-91c0-1880ff96b4f3\") " pod="openshift-image-registry/node-ca-rxr42" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664089 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/92909e59-b659-4fc0-91c0-1880ff96b4f3-serviceca\") pod \"node-ca-rxr42\" (UID: \"92909e59-b659-4fc0-91c0-1880ff96b4f3\") " pod="openshift-image-registry/node-ca-rxr42" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664104 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18ca8765-c435-4750-b803-14b539958d9e-os-release\") pod \"multus-additional-cni-plugins-9xbn5\" (UID: \"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664118 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18ca8765-c435-4750-b803-14b539958d9e-cni-binary-copy\") pod \"multus-additional-cni-plugins-9xbn5\" (UID: \"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664141 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzwzn\" (UniqueName: \"kubernetes.io/projected/1a95b7ab-8632-4332-a30f-64f28ef8d313-kube-api-access-qzwzn\") pod \"ovnkube-node-whk5b\" (UID: 
\"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664176 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-multus-socket-dir-parent\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664189 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cc694bce-8c25-4729-b452-29d44d3efe6e-rootfs\") pod \"machine-config-daemon-5mz5c\" (UID: \"cc694bce-8c25-4729-b452-29d44d3efe6e\") " pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664211 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664233 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-multus-conf-dir\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664248 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54j2z\" (UniqueName: \"kubernetes.io/projected/ce311f52-0501-45d3-8209-b1d2aa25028b-kube-api-access-54j2z\") pod \"multus-8hm26\" (UID: 
\"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664263 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-slash\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664277 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc694bce-8c25-4729-b452-29d44d3efe6e-mcd-auth-proxy-config\") pod \"machine-config-daemon-5mz5c\" (UID: \"cc694bce-8c25-4729-b452-29d44d3efe6e\") " pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664293 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzz9z\" (UniqueName: \"kubernetes.io/projected/0147ed28-bc0f-409c-a813-dc2ffffba092-kube-api-access-wzz9z\") pod \"node-resolver-f5988\" (UID: \"0147ed28-bc0f-409c-a813-dc2ffffba092\") " pod="openshift-dns/node-resolver-f5988" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664308 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-host-run-multus-certs\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664328 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-log-socket\") pod \"ovnkube-node-whk5b\" 
(UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664358 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-cni-bin\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664372 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-cni-netd\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664389 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0147ed28-bc0f-409c-a813-dc2ffffba092-hosts-file\") pod \"node-resolver-f5988\" (UID: \"0147ed28-bc0f-409c-a813-dc2ffffba092\") " pod="openshift-dns/node-resolver-f5988" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664403 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-host-run-netns\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664439 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-multus-cni-dir\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " 
pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664458 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18ca8765-c435-4750-b803-14b539958d9e-system-cni-dir\") pod \"multus-additional-cni-plugins-9xbn5\" (UID: \"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664471 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-run-systemd\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664486 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664529 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664540 4599 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664549 4599 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664558 4599 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664566 4599 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664580 4599 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664589 4599 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664598 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664608 4599 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664617 4599 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664627 4599 
reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664636 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664650 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664657 4599 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664667 4599 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664676 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664686 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664694 4599 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664703 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664712 4599 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664720 4599 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664729 4599 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664740 4599 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664749 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664758 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" 
DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664766 4599 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664775 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664784 4599 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664792 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664799 4599 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664808 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664816 4599 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664824 4599 reconciler_common.go:293] 
"Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664833 4599 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664842 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664851 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664861 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664870 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664878 4599 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664887 4599 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664896 4599 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664904 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664913 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664922 4599 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664931 4599 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664939 4599 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664947 4599 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664955 4599 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664964 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664972 4599 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664980 4599 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664988 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.664997 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.665005 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 
12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.665012 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.665021 4599 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.665029 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.665037 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.665051 4599 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.665058 4599 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.665067 4599 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.665085 4599 reconciler_common.go:293] "Volume 
detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.665093 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.665868 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.666171 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/92909e59-b659-4fc0-91c0-1880ff96b4f3-host\") pod \"node-ca-rxr42\" (UID: \"92909e59-b659-4fc0-91c0-1880ff96b4f3\") " pod="openshift-image-registry/node-ca-rxr42" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.666407 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.666647 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.666795 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0147ed28-bc0f-409c-a813-dc2ffffba092-hosts-file\") pod \"node-resolver-f5988\" (UID: \"0147ed28-bc0f-409c-a813-dc2ffffba092\") " pod="openshift-dns/node-resolver-f5988" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.665102 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.667009 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/92909e59-b659-4fc0-91c0-1880ff96b4f3-serviceca\") pod \"node-ca-rxr42\" (UID: \"92909e59-b659-4fc0-91c0-1880ff96b4f3\") " pod="openshift-image-registry/node-ca-rxr42" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.667535 
4599 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.667585 4599 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.667601 4599 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.667755 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.667778 4599 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.667809 4599 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.667823 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.667833 4599 reconciler_common.go:293] "Volume detached for volume 
\"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.667842 4599 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.667851 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.667865 4599 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.667875 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.667912 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.667922 4599 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.667944 4599 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668123 4599 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668135 4599 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668145 4599 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668155 4599 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668164 4599 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668173 4599 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668182 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" 
DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668191 4599 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668201 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668213 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668222 4599 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668232 4599 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668241 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668250 4599 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668259 4599 reconciler_common.go:293] 
"Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668268 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668277 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668285 4599 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668294 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668303 4599 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668311 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668322 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668330 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668357 4599 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668366 4599 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668375 4599 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668383 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668392 4599 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668401 4599 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668410 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668419 4599 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668428 4599 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668436 4599 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668446 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668455 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668465 4599 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" 
DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668475 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668486 4599 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668496 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668507 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668516 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668526 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668535 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 
12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668546 4599 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668555 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668564 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668592 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668600 4599 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668610 4599 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668618 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668627 
4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668636 4599 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668644 4599 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668652 4599 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668660 4599 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668669 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668677 4599 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668686 4599 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668695 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668705 4599 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668714 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668723 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668732 4599 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668740 4599 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668749 4599 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668757 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668796 4599 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668805 4599 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668815 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668824 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668835 4599 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668844 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 12 
07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668853 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668861 4599 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668870 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668877 4599 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668887 4599 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668895 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668903 4599 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: 
I1012 07:35:21.668915 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668923 4599 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668931 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668939 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668947 4599 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668955 4599 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668964 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668972 4599 reconciler_common.go:293] 
"Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668980 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668988 4599 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.668996 4599 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669005 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669014 4599 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669024 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669032 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669041 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669052 4599 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669061 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669085 4599 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669096 4599 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669107 4599 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669115 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 
12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669124 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669156 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669165 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669174 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669182 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669190 4599 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669199 4599 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669207 4599 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669217 4599 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669225 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.669234 4599 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.676327 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.681771 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7hr5\" (UniqueName: \"kubernetes.io/projected/92909e59-b659-4fc0-91c0-1880ff96b4f3-kube-api-access-h7hr5\") pod \"node-ca-rxr42\" (UID: \"92909e59-b659-4fc0-91c0-1880ff96b4f3\") " pod="openshift-image-registry/node-ca-rxr42" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.686877 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzz9z\" (UniqueName: \"kubernetes.io/projected/0147ed28-bc0f-409c-a813-dc2ffffba092-kube-api-access-wzz9z\") pod \"node-resolver-f5988\" (UID: \"0147ed28-bc0f-409c-a813-dc2ffffba092\") " pod="openshift-dns/node-resolver-f5988" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.689188 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.709876 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.723733 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.733149 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.741084 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.749282 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.762466 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.768113 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770027 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-run-ovn\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770060 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovnkube-script-lib\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770093 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-system-cni-dir\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770113 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18ca8765-c435-4750-b803-14b539958d9e-cnibin\") pod \"multus-additional-cni-plugins-9xbn5\" (UID: \"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770109 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-run-ovn\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770145 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25l5n\" (UniqueName: \"kubernetes.io/projected/cc694bce-8c25-4729-b452-29d44d3efe6e-kube-api-access-25l5n\") pod \"machine-config-daemon-5mz5c\" (UID: \"cc694bce-8c25-4729-b452-29d44d3efe6e\") " pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770162 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18ca8765-c435-4750-b803-14b539958d9e-cnibin\") pod 
\"multus-additional-cni-plugins-9xbn5\" (UID: \"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770166 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-host-var-lib-cni-bin\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770194 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-host-var-lib-cni-bin\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770202 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-etc-openvswitch\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770223 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-etc-openvswitch\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770233 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-run-openvswitch\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770260 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-run-ovn-kubernetes\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770266 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-system-cni-dir\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770316 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-run-openvswitch\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770280 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ce311f52-0501-45d3-8209-b1d2aa25028b-cni-binary-copy\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770371 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-run-ovn-kubernetes\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 
07:35:21.770396 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-host-var-lib-cni-multus\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770440 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18ca8765-c435-4750-b803-14b539958d9e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9xbn5\" (UID: \"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770454 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-host-var-lib-cni-multus\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770481 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-kubelet\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770461 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-kubelet\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770512 4599 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-st7tw\" (UniqueName: \"kubernetes.io/projected/18ca8765-c435-4750-b803-14b539958d9e-kube-api-access-st7tw\") pod \"multus-additional-cni-plugins-9xbn5\" (UID: \"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770532 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-var-lib-openvswitch\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770551 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovnkube-config\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770565 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-env-overrides\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770582 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-host-var-lib-kubelet\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770599 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/cc694bce-8c25-4729-b452-29d44d3efe6e-proxy-tls\") pod \"machine-config-daemon-5mz5c\" (UID: \"cc694bce-8c25-4729-b452-29d44d3efe6e\") " pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770600 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-var-lib-openvswitch\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770614 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-run-netns\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770631 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovn-node-metrics-cert\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770659 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-hostroot\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770678 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-etc-kubernetes\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770693 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/18ca8765-c435-4750-b803-14b539958d9e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9xbn5\" (UID: \"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770715 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-systemd-units\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770731 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ce311f52-0501-45d3-8209-b1d2aa25028b-multus-daemon-config\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770747 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-node-log\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770763 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-os-release\") pod 
\"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770779 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-host-run-k8s-cni-cncf-io\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770793 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-cnibin\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770809 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18ca8765-c435-4750-b803-14b539958d9e-cni-binary-copy\") pod \"multus-additional-cni-plugins-9xbn5\" (UID: \"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770812 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-hostroot\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770826 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzwzn\" (UniqueName: \"kubernetes.io/projected/1a95b7ab-8632-4332-a30f-64f28ef8d313-kube-api-access-qzwzn\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 
07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770844 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18ca8765-c435-4750-b803-14b539958d9e-os-release\") pod \"multus-additional-cni-plugins-9xbn5\" (UID: \"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770857 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18ca8765-c435-4750-b803-14b539958d9e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9xbn5\" (UID: \"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770860 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cc694bce-8c25-4729-b452-29d44d3efe6e-rootfs\") pod \"machine-config-daemon-5mz5c\" (UID: \"cc694bce-8c25-4729-b452-29d44d3efe6e\") " pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770895 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-host-var-lib-kubelet\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770897 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-etc-kubernetes\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770917 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ce311f52-0501-45d3-8209-b1d2aa25028b-cni-binary-copy\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770880 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cc694bce-8c25-4729-b452-29d44d3efe6e-rootfs\") pod \"machine-config-daemon-5mz5c\" (UID: \"cc694bce-8c25-4729-b452-29d44d3efe6e\") " pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.770978 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18ca8765-c435-4750-b803-14b539958d9e-os-release\") pod \"multus-additional-cni-plugins-9xbn5\" (UID: \"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771004 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-run-netns\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771019 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-systemd-units\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771437 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-os-release\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771497 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-multus-socket-dir-parent\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771535 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-multus-conf-dir\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771556 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54j2z\" (UniqueName: \"kubernetes.io/projected/ce311f52-0501-45d3-8209-b1d2aa25028b-kube-api-access-54j2z\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771557 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18ca8765-c435-4750-b803-14b539958d9e-cni-binary-copy\") pod \"multus-additional-cni-plugins-9xbn5\" (UID: \"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771560 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ce311f52-0501-45d3-8209-b1d2aa25028b-multus-daemon-config\") pod \"multus-8hm26\" 
(UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771614 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-host-run-k8s-cni-cncf-io\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771615 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-multus-conf-dir\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771639 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-multus-socket-dir-parent\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771650 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-slash\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771668 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-cnibin\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771698 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-slash\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771719 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-node-log\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771764 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc694bce-8c25-4729-b452-29d44d3efe6e-mcd-auth-proxy-config\") pod \"machine-config-daemon-5mz5c\" (UID: \"cc694bce-8c25-4729-b452-29d44d3efe6e\") " pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771881 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-cni-netd\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771912 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-host-run-multus-certs\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771937 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-log-socket\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771952 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-cni-bin\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.771978 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-cni-netd\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.772019 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-host-run-netns\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.772043 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-cni-bin\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.772047 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-host-run-multus-certs\") pod 
\"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.772062 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-host-run-netns\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.772065 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-log-socket\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.772102 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-multus-cni-dir\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.772117 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18ca8765-c435-4750-b803-14b539958d9e-system-cni-dir\") pod \"multus-additional-cni-plugins-9xbn5\" (UID: \"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.772131 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-run-systemd\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc 
kubenswrapper[4599]: I1012 07:35:21.772148 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.772189 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.772211 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18ca8765-c435-4750-b803-14b539958d9e-system-cni-dir\") pod \"multus-additional-cni-plugins-9xbn5\" (UID: \"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.772224 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce311f52-0501-45d3-8209-b1d2aa25028b-multus-cni-dir\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.772233 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-run-systemd\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc 
kubenswrapper[4599]: I1012 07:35:21.772259 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/18ca8765-c435-4750-b803-14b539958d9e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9xbn5\" (UID: \"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.772315 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc694bce-8c25-4729-b452-29d44d3efe6e-mcd-auth-proxy-config\") pod \"machine-config-daemon-5mz5c\" (UID: \"cc694bce-8c25-4729-b452-29d44d3efe6e\") " pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.773778 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovn-node-metrics-cert\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.774546 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.774816 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cc694bce-8c25-4729-b452-29d44d3efe6e-proxy-tls\") pod \"machine-config-daemon-5mz5c\" (UID: \"cc694bce-8c25-4729-b452-29d44d3efe6e\") " pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.783650 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st7tw\" (UniqueName: \"kubernetes.io/projected/18ca8765-c435-4750-b803-14b539958d9e-kube-api-access-st7tw\") pod 
\"multus-additional-cni-plugins-9xbn5\" (UID: \"18ca8765-c435-4750-b803-14b539958d9e\") " pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.784185 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25l5n\" (UniqueName: \"kubernetes.io/projected/cc694bce-8c25-4729-b452-29d44d3efe6e-kube-api-access-25l5n\") pod \"machine-config-daemon-5mz5c\" (UID: \"cc694bce-8c25-4729-b452-29d44d3efe6e\") " pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.785006 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54j2z\" (UniqueName: \"kubernetes.io/projected/ce311f52-0501-45d3-8209-b1d2aa25028b-kube-api-access-54j2z\") pod \"multus-8hm26\" (UID: \"ce311f52-0501-45d3-8209-b1d2aa25028b\") " pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.816293 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.821117 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 07:35:21 crc kubenswrapper[4599]: W1012 07:35:21.827741 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-febd299bfef3cce60044c5fcacbcbbcd2fb50f0997907a8b7ac9a1c48e4814d2 WatchSource:0}: Error finding container febd299bfef3cce60044c5fcacbcbbcd2fb50f0997907a8b7ac9a1c48e4814d2: Status 404 returned error can't find the container with id febd299bfef3cce60044c5fcacbcbbcd2fb50f0997907a8b7ac9a1c48e4814d2 Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.828210 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 07:35:21 crc kubenswrapper[4599]: W1012 07:35:21.831535 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-3fb81c663bfe426ff7c112944787c22162a633b0ed3b51af661c7e2ecbd24a4b WatchSource:0}: Error finding container 3fb81c663bfe426ff7c112944787c22162a633b0ed3b51af661c7e2ecbd24a4b: Status 404 returned error can't find the container with id 3fb81c663bfe426ff7c112944787c22162a633b0ed3b51af661c7e2ecbd24a4b Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.837487 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rxr42" Oct 12 07:35:21 crc kubenswrapper[4599]: W1012 07:35:21.839371 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-1ec91dd660f3da374048257cabcd49b5185ebe93c5480f7ac1a9ec636c42d72b WatchSource:0}: Error finding container 1ec91dd660f3da374048257cabcd49b5185ebe93c5480f7ac1a9ec636c42d72b: Status 404 returned error can't find the container with id 1ec91dd660f3da374048257cabcd49b5185ebe93c5480f7ac1a9ec636c42d72b Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.845170 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-f5988" Oct 12 07:35:21 crc kubenswrapper[4599]: W1012 07:35:21.854908 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92909e59_b659_4fc0_91c0_1880ff96b4f3.slice/crio-e64d5d41c912ee09e08197c6885fcf79720e4092a02d2f4a9cf377f86db2c9b2 WatchSource:0}: Error finding container e64d5d41c912ee09e08197c6885fcf79720e4092a02d2f4a9cf377f86db2c9b2: Status 404 returned error can't find the container with id e64d5d41c912ee09e08197c6885fcf79720e4092a02d2f4a9cf377f86db2c9b2 Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.874382 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.900739 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" Oct 12 07:35:21 crc kubenswrapper[4599]: I1012 07:35:21.916226 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8hm26" Oct 12 07:35:21 crc kubenswrapper[4599]: W1012 07:35:21.931513 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18ca8765_c435_4750_b803_14b539958d9e.slice/crio-9f3a414478dd79dc0b765e17b78f3466991f2e22f8ad8ff4f52c6bfe7846c793 WatchSource:0}: Error finding container 9f3a414478dd79dc0b765e17b78f3466991f2e22f8ad8ff4f52c6bfe7846c793: Status 404 returned error can't find the container with id 9f3a414478dd79dc0b765e17b78f3466991f2e22f8ad8ff4f52c6bfe7846c793 Oct 12 07:35:21 crc kubenswrapper[4599]: W1012 07:35:21.935325 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce311f52_0501_45d3_8209_b1d2aa25028b.slice/crio-c039df3dc04694bce87734dd32893775a3d84748c638f1c81b8bd73c54629646 WatchSource:0}: Error finding container c039df3dc04694bce87734dd32893775a3d84748c638f1c81b8bd73c54629646: Status 404 returned error can't find the container with id c039df3dc04694bce87734dd32893775a3d84748c638f1c81b8bd73c54629646 Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.075119 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.075374 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:35:23.07535751 +0000 UTC m=+19.864553012 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.176033 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.176105 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.176128 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.176156 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.176256 4599 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.176268 4599 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.176310 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.176364 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.176320 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:23.176306748 +0000 UTC m=+19.965502250 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.176380 4599 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.176413 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.176454 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.176470 4599 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.176419 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:23.176398243 +0000 UTC m=+19.965593746 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.176536 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:23.176517643 +0000 UTC m=+19.965713144 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.176564 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:23.176557339 +0000 UTC m=+19.965752841 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.624429 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f5988" event={"ID":"0147ed28-bc0f-409c-a813-dc2ffffba092","Type":"ContainerStarted","Data":"b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a"} Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.624739 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f5988" event={"ID":"0147ed28-bc0f-409c-a813-dc2ffffba092","Type":"ContainerStarted","Data":"3fbc48f006c6a502783bf3ca6eb5d5173b47853ab08cdd68619313534a7938fc"} Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.626527 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rxr42" event={"ID":"92909e59-b659-4fc0-91c0-1880ff96b4f3","Type":"ContainerStarted","Data":"4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1"} Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.626557 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rxr42" event={"ID":"92909e59-b659-4fc0-91c0-1880ff96b4f3","Type":"ContainerStarted","Data":"e64d5d41c912ee09e08197c6885fcf79720e4092a02d2f4a9cf377f86db2c9b2"} Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.628871 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c"} Oct 12 
07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.628900 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130"} Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.628911 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3fb81c663bfe426ff7c112944787c22162a633b0ed3b51af661c7e2ecbd24a4b"} Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.630462 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerStarted","Data":"5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29"} Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.630493 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerStarted","Data":"238fe8d2b2e7f08d70bc5f938aca3c90b68c447db21a713d651d47b47a8127c2"} Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.630506 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerStarted","Data":"0d01e9b597b4a84de30af416243034615d2a2fb909ab98c603298bae895ebcc8"} Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.631882 4599 generic.go:334] "Generic (PLEG): container finished" podID="18ca8765-c435-4750-b803-14b539958d9e" containerID="16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001" exitCode=0 Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.631934 
4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" event={"ID":"18ca8765-c435-4750-b803-14b539958d9e","Type":"ContainerDied","Data":"16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001"} Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.631952 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" event={"ID":"18ca8765-c435-4750-b803-14b539958d9e","Type":"ContainerStarted","Data":"9f3a414478dd79dc0b765e17b78f3466991f2e22f8ad8ff4f52c6bfe7846c793"} Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.634019 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1ec91dd660f3da374048257cabcd49b5185ebe93c5480f7ac1a9ec636c42d72b"} Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.637864 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.638661 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.641098 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef"} Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.641576 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.643759 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8hm26" event={"ID":"ce311f52-0501-45d3-8209-b1d2aa25028b","Type":"ContainerStarted","Data":"611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38"} Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.643813 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8hm26" event={"ID":"ce311f52-0501-45d3-8209-b1d2aa25028b","Type":"ContainerStarted","Data":"c039df3dc04694bce87734dd32893775a3d84748c638f1c81b8bd73c54629646"} Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.644613 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7"} Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.644644 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"febd299bfef3cce60044c5fcacbcbbcd2fb50f0997907a8b7ac9a1c48e4814d2"} Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.647956 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.658652 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.672675 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.681756 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.691717 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.702312 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.709511 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.730294 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.745181 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded 
a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.758976 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.777598 4599 configmap.go:193] Couldn't get configMap openshift-ovn-kubernetes/ovnkube-script-lib: failed to sync configmap cache: timed out waiting for the condition Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.777773 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovnkube-script-lib podName:1a95b7ab-8632-4332-a30f-64f28ef8d313 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:23.277754856 +0000 UTC m=+20.066950358 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "ovnkube-script-lib" (UniqueName: "kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovnkube-script-lib") pod "ovnkube-node-whk5b" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313") : failed to sync configmap cache: timed out waiting for the condition Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.778168 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.778427 4599 configmap.go:193] Couldn't get configMap openshift-ovn-kubernetes/ovnkube-config: failed to sync configmap cache: timed out waiting for the condition Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.778522 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovnkube-config podName:1a95b7ab-8632-4332-a30f-64f28ef8d313 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:23.278512058 +0000 UTC m=+20.067707560 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "ovnkube-config" (UniqueName: "kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovnkube-config") pod "ovnkube-node-whk5b" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313") : failed to sync configmap cache: timed out waiting for the condition Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.778440 4599 configmap.go:193] Couldn't get configMap openshift-ovn-kubernetes/env-overrides: failed to sync configmap cache: timed out waiting for the condition Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.778655 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-env-overrides podName:1a95b7ab-8632-4332-a30f-64f28ef8d313 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:23.278648069 +0000 UTC m=+20.067843571 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "env-overrides" (UniqueName: "kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-env-overrides") pod "ovnkube-node-whk5b" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313") : failed to sync configmap cache: timed out waiting for the condition Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.783297 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.783480 4599 projected.go:288] Couldn't get configMap openshift-ovn-kubernetes/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.795636 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.807324 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.819035 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 
07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.830218 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.845031 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.855423 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.866095 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.875434 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.886972 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.897529 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.915464 4599 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.927674 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.941141 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.950477 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.958733 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.974157 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:22Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:22 crc kubenswrapper[4599]: I1012 07:35:22.974495 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.984323 4599 projected.go:194] Error preparing data for projected volume kube-api-access-qzwzn for pod openshift-ovn-kubernetes/ovnkube-node-whk5b: failed to sync configmap cache: timed out waiting for the condition Oct 12 07:35:22 crc kubenswrapper[4599]: E1012 07:35:22.984488 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1a95b7ab-8632-4332-a30f-64f28ef8d313-kube-api-access-qzwzn podName:1a95b7ab-8632-4332-a30f-64f28ef8d313 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:23.484469407 +0000 UTC m=+20.273664909 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qzwzn" (UniqueName: "kubernetes.io/projected/1a95b7ab-8632-4332-a30f-64f28ef8d313-kube-api-access-qzwzn") pod "ovnkube-node-whk5b" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313") : failed to sync configmap cache: timed out waiting for the condition Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.056914 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.087819 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:35:23 crc kubenswrapper[4599]: E1012 07:35:23.088056 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:35:25.088027396 +0000 UTC m=+21.877222898 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.108167 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.146496 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.188740 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.188839 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.188922 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:23 crc kubenswrapper[4599]: E1012 07:35:23.188981 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 07:35:23 crc kubenswrapper[4599]: E1012 07:35:23.189016 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 07:35:23 crc kubenswrapper[4599]: E1012 07:35:23.189028 4599 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:23 crc kubenswrapper[4599]: E1012 07:35:23.189084 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:25.189065735 +0000 UTC m=+21.978261238 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.189182 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:23 crc kubenswrapper[4599]: E1012 07:35:23.189352 4599 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 07:35:23 crc kubenswrapper[4599]: E1012 07:35:23.189448 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:25.189435925 +0000 UTC m=+21.978631418 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 07:35:23 crc kubenswrapper[4599]: E1012 07:35:23.189455 4599 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 07:35:23 crc kubenswrapper[4599]: E1012 07:35:23.189588 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:25.189579932 +0000 UTC m=+21.978775433 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 07:35:23 crc kubenswrapper[4599]: E1012 07:35:23.189635 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 07:35:23 crc kubenswrapper[4599]: E1012 07:35:23.189701 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 07:35:23 crc kubenswrapper[4599]: E1012 07:35:23.189718 4599 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:23 crc kubenswrapper[4599]: E1012 07:35:23.189808 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:25.189788721 +0000 UTC m=+21.978984233 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.290400 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovnkube-script-lib\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.290627 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovnkube-config\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.291402 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovnkube-config\") pod 
\"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.291436 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-env-overrides\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.291436 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovnkube-script-lib\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.291646 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-env-overrides\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.494491 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzwzn\" (UniqueName: \"kubernetes.io/projected/1a95b7ab-8632-4332-a30f-64f28ef8d313-kube-api-access-qzwzn\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.500186 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzwzn\" (UniqueName: \"kubernetes.io/projected/1a95b7ab-8632-4332-a30f-64f28ef8d313-kube-api-access-qzwzn\") pod \"ovnkube-node-whk5b\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.544541 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.544545 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:23 crc kubenswrapper[4599]: E1012 07:35:23.544818 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:35:23 crc kubenswrapper[4599]: E1012 07:35:23.544973 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.544597 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:23 crc kubenswrapper[4599]: E1012 07:35:23.545203 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.548309 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.548981 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.550053 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.550650 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.551572 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.552058 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.552617 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.556603 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.558491 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.559291 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.559445 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.559945 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.561031 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.561585 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.562078 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.562980 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.563493 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.564430 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.564825 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.565403 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.566369 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.566814 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.567369 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.568143 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.568763 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.569551 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.570102 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.571026 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.571494 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.571599 4599 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.572407 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.572861 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.573296 4599 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.573740 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.575269 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.575846 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.576647 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.578016 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.578641 4599 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.579636 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.580235 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.581178 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.581650 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.582087 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.582543 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.583146 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.584023 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.584518 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.585365 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.585853 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.586842 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.587327 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.588153 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.588743 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.589219 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.590078 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.590543 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.592757 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.602971 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.616408 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.626216 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.636827 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.650190 4599 generic.go:334] "Generic (PLEG): container finished" podID="18ca8765-c435-4750-b803-14b539958d9e" containerID="5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556" exitCode=0 Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.650503 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" event={"ID":"18ca8765-c435-4750-b803-14b539958d9e","Type":"ContainerDied","Data":"5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556"} Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.652401 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.665613 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.678113 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.688812 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.693532 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.701940 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"n
ame\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447db21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.714306 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.724425 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.733208 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.756536 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.770701 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.789885 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.806518 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.817615 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.825281 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.842218 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.852747 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.862712 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.883770 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.896187 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.906601 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.953910 4599 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.955876 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.955916 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.955929 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.956006 4599 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.961376 4599 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.961581 4599 kubelet_node_status.go:79] "Successfully registered node" 
node="crc" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.962331 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.962385 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.962397 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.962411 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.962420 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:23Z","lastTransitionTime":"2025-10-12T07:35:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:23 crc kubenswrapper[4599]: E1012 07:35:23.976117 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.978590 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.978620 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.978630 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.978645 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.978659 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:23Z","lastTransitionTime":"2025-10-12T07:35:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:23 crc kubenswrapper[4599]: E1012 07:35:23.987049 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.989809 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.989838 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.989849 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.989860 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:23 crc kubenswrapper[4599]: I1012 07:35:23.989868 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:23Z","lastTransitionTime":"2025-10-12T07:35:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:23 crc kubenswrapper[4599]: E1012 07:35:23.998555 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.017224 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.017258 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.017270 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.017287 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.017300 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:24Z","lastTransitionTime":"2025-10-12T07:35:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:24 crc kubenswrapper[4599]: E1012 07:35:24.027196 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: E1012 07:35:24.027327 4599 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.029307 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.029356 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.029379 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.029398 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.029410 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:24Z","lastTransitionTime":"2025-10-12T07:35:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.131511 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.131727 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.131806 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.131878 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.132133 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:24Z","lastTransitionTime":"2025-10-12T07:35:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.234607 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.234657 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.234673 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.234702 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.234715 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:24Z","lastTransitionTime":"2025-10-12T07:35:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.337532 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.337567 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.337577 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.337592 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.337602 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:24Z","lastTransitionTime":"2025-10-12T07:35:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.446625 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.446863 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.446944 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.447038 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.447113 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:24Z","lastTransitionTime":"2025-10-12T07:35:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.550285 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.550358 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.550371 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.550391 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.550402 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:24Z","lastTransitionTime":"2025-10-12T07:35:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.653211 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.653246 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.653256 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.653272 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.653282 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:24Z","lastTransitionTime":"2025-10-12T07:35:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.655458 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39"} Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.657458 4599 generic.go:334] "Generic (PLEG): container finished" podID="18ca8765-c435-4750-b803-14b539958d9e" containerID="e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a" exitCode=0 Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.657520 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" event={"ID":"18ca8765-c435-4750-b803-14b539958d9e","Type":"ContainerDied","Data":"e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a"} Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.659314 4599 generic.go:334] "Generic (PLEG): container finished" podID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerID="ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5" exitCode=0 Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.659378 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerDied","Data":"ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5"} Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.659450 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerStarted","Data":"ae3914feb32fa6adbbf41b62fde5ead6f30a591f461ac1ee313218ae5cefb8c3"} Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.669164 4599 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.679312 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.692776 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.708965 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.718910 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.729072 4599 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.737862 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.752037 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.755803 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.755834 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.755843 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.755878 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.755891 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:24Z","lastTransitionTime":"2025-10-12T07:35:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.762034 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.770298 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e8
9af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.779522 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.790195 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 
07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.800238 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.809484 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.821020 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.833640 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.845421 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.857514 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.858172 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.858218 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.858229 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.858245 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.858256 4599 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:24Z","lastTransitionTime":"2025-10-12T07:35:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.866144 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.876458 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a1
4f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.887319 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.896043 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.903669 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.918937 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.929505 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.949907 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.963379 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.963412 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.963422 4599 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.963438 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.963448 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:24Z","lastTransitionTime":"2025-10-12T07:35:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:24 crc kubenswrapper[4599]: I1012 07:35:24.991592 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd03
7e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447db21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:24Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.030438 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:25Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.066997 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.067042 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.067053 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.067076 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.067090 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:25Z","lastTransitionTime":"2025-10-12T07:35:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.110295 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:35:25 crc kubenswrapper[4599]: E1012 07:35:25.110458 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:35:29.110431381 +0000 UTC m=+25.899626884 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.169490 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.169531 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.169540 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.169555 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:25 crc kubenswrapper[4599]: 
I1012 07:35:25.169566 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:25Z","lastTransitionTime":"2025-10-12T07:35:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.211975 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.212023 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.212044 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.212065 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:25 crc kubenswrapper[4599]: E1012 07:35:25.212152 4599 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 07:35:25 crc kubenswrapper[4599]: E1012 07:35:25.212234 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:29.212218706 +0000 UTC m=+26.001414208 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 07:35:25 crc kubenswrapper[4599]: E1012 07:35:25.212250 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 07:35:25 crc kubenswrapper[4599]: E1012 07:35:25.212287 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 07:35:25 crc kubenswrapper[4599]: E1012 07:35:25.212300 4599 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:25 crc kubenswrapper[4599]: E1012 07:35:25.212250 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 07:35:25 crc kubenswrapper[4599]: E1012 07:35:25.212386 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:29.212367642 +0000 UTC m=+26.001563144 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:25 crc kubenswrapper[4599]: E1012 07:35:25.212431 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 07:35:25 crc kubenswrapper[4599]: E1012 07:35:25.212456 4599 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:25 crc kubenswrapper[4599]: E1012 07:35:25.212156 4599 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 
07:35:25 crc kubenswrapper[4599]: E1012 07:35:25.212498 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:29.212490297 +0000 UTC m=+26.001685799 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 07:35:25 crc kubenswrapper[4599]: E1012 07:35:25.212528 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:29.212505666 +0000 UTC m=+26.001701168 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.271686 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.271730 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.271740 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.271755 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.271766 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:25Z","lastTransitionTime":"2025-10-12T07:35:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.374532 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.374595 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.374605 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.374625 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.374637 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:25Z","lastTransitionTime":"2025-10-12T07:35:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.477193 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.477280 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.478210 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.478254 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.478276 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:25Z","lastTransitionTime":"2025-10-12T07:35:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.544240 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.544350 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.544400 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:25 crc kubenswrapper[4599]: E1012 07:35:25.544398 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:35:25 crc kubenswrapper[4599]: E1012 07:35:25.544482 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:35:25 crc kubenswrapper[4599]: E1012 07:35:25.544557 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.580468 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.580510 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.580519 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.580535 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.580545 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:25Z","lastTransitionTime":"2025-10-12T07:35:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.667017 4599 generic.go:334] "Generic (PLEG): container finished" podID="18ca8765-c435-4750-b803-14b539958d9e" containerID="e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646" exitCode=0 Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.667102 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" event={"ID":"18ca8765-c435-4750-b803-14b539958d9e","Type":"ContainerDied","Data":"e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646"} Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.673114 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerStarted","Data":"5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8"} Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.673221 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerStarted","Data":"a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c"} Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.673290 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerStarted","Data":"c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5"} Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.673366 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerStarted","Data":"543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb"} Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.673426 4599 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerStarted","Data":"f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea"} Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.673483 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerStarted","Data":"810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d"} Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.679440 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:25Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.682761 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.682800 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.682812 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.682830 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:25 crc kubenswrapper[4599]: 
I1012 07:35:25.682844 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:25Z","lastTransitionTime":"2025-10-12T07:35:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.692640 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703
f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:25Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.710557 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:25Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.724909 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:25Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.736860 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:25Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.749048 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:25Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.759230 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 
07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:25Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.769121 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:25Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.779051 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:25Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.785546 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.785594 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.785609 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 
07:35:25.785631 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.785642 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:25Z","lastTransitionTime":"2025-10-12T07:35:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.787816 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:25Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.799015 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:25Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.808746 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:35:25Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.820366 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:25Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.829806 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:25Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.888263 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.888304 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.888314 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.888350 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.888364 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:25Z","lastTransitionTime":"2025-10-12T07:35:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.990779 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.990851 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.990862 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.990881 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:25 crc kubenswrapper[4599]: I1012 07:35:25.990909 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:25Z","lastTransitionTime":"2025-10-12T07:35:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.093042 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.093085 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.093094 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.093122 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.093132 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:26Z","lastTransitionTime":"2025-10-12T07:35:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.195765 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.195823 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.195833 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.195853 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.195863 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:26Z","lastTransitionTime":"2025-10-12T07:35:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.299490 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.299552 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.299563 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.299584 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.299596 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:26Z","lastTransitionTime":"2025-10-12T07:35:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.401995 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.402049 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.402064 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.402081 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.402093 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:26Z","lastTransitionTime":"2025-10-12T07:35:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.505008 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.505064 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.505078 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.505100 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.505113 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:26Z","lastTransitionTime":"2025-10-12T07:35:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.608231 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.608289 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.608299 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.608318 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.608353 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:26Z","lastTransitionTime":"2025-10-12T07:35:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.682032 4599 generic.go:334] "Generic (PLEG): container finished" podID="18ca8765-c435-4750-b803-14b539958d9e" containerID="6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1" exitCode=0 Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.682081 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" event={"ID":"18ca8765-c435-4750-b803-14b539958d9e","Type":"ContainerDied","Data":"6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1"} Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.694849 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:26Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.707150 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:26Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.710492 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.710534 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.710546 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.710571 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.710581 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:26Z","lastTransitionTime":"2025-10-12T07:35:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.718738 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:26Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.728507 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:35:26Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.742875 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:26Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.754540 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114
a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:26Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.763384 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162
f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:26Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.771515 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:26Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.784623 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:26Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.793867 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:26Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.803136 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:26Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.812131 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:26Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.815475 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.815517 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.815528 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:26 crc 
kubenswrapper[4599]: I1012 07:35:26.815545 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.815560 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:26Z","lastTransitionTime":"2025-10-12T07:35:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.822102 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:26Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.832055 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:26Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.917937 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.917992 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.918003 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.918024 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:26 crc kubenswrapper[4599]: I1012 07:35:26.918038 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:26Z","lastTransitionTime":"2025-10-12T07:35:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.025159 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.025444 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.025466 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.025483 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.025495 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:27Z","lastTransitionTime":"2025-10-12T07:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.128760 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.128814 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.128825 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.128847 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.128861 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:27Z","lastTransitionTime":"2025-10-12T07:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.231232 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.231284 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.231295 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.231311 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.231321 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:27Z","lastTransitionTime":"2025-10-12T07:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.334612 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.334959 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.334971 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.334990 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.335002 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:27Z","lastTransitionTime":"2025-10-12T07:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.436819 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.436865 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.436874 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.436890 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.436899 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:27Z","lastTransitionTime":"2025-10-12T07:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.539659 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.539714 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.539724 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.539744 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.539759 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:27Z","lastTransitionTime":"2025-10-12T07:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.544919 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.544970 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.544917 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:27 crc kubenswrapper[4599]: E1012 07:35:27.545071 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:35:27 crc kubenswrapper[4599]: E1012 07:35:27.545232 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:35:27 crc kubenswrapper[4599]: E1012 07:35:27.545448 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.642483 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.642522 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.642531 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.642552 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.642563 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:27Z","lastTransitionTime":"2025-10-12T07:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.688306 4599 generic.go:334] "Generic (PLEG): container finished" podID="18ca8765-c435-4750-b803-14b539958d9e" containerID="8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69" exitCode=0 Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.688421 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" event={"ID":"18ca8765-c435-4750-b803-14b539958d9e","Type":"ContainerDied","Data":"8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69"} Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.693402 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerStarted","Data":"ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5"} Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.704723 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:27Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.715801 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:27Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.725603 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:27Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.733945 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:35:27Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.744794 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.744826 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.744836 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.744852 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.744863 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:27Z","lastTransitionTime":"2025-10-12T07:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.745577 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:27Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.755583 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:27Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.763111 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:27Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.771164 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:27Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.784851 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:27Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.795287 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:27Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.804570 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:27Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.814633 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:27Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.822437 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:27Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.831826 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:27Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.847035 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.847075 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.847088 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.847105 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.847114 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:27Z","lastTransitionTime":"2025-10-12T07:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.949855 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.949898 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.949907 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.949933 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:27 crc kubenswrapper[4599]: I1012 07:35:27.949943 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:27Z","lastTransitionTime":"2025-10-12T07:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.052714 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.052763 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.052777 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.052794 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.052805 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:28Z","lastTransitionTime":"2025-10-12T07:35:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.154865 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.154912 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.154924 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.154941 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.154951 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:28Z","lastTransitionTime":"2025-10-12T07:35:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.257765 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.257817 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.257828 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.257846 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.257856 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:28Z","lastTransitionTime":"2025-10-12T07:35:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.365576 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.365719 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.365770 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.365799 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.365818 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:28Z","lastTransitionTime":"2025-10-12T07:35:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.467976 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.468021 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.468031 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.468046 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.468056 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:28Z","lastTransitionTime":"2025-10-12T07:35:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.570387 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.570434 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.570444 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.570462 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.570473 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:28Z","lastTransitionTime":"2025-10-12T07:35:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.672527 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.672559 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.672568 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.672583 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.672593 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:28Z","lastTransitionTime":"2025-10-12T07:35:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.700961 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" event={"ID":"18ca8765-c435-4750-b803-14b539958d9e","Type":"ContainerStarted","Data":"4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3"} Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.722226 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:28Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.732940 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:28Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.744653 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:28Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.754381 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447db21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:28Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.766472 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:28Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.777535 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.777611 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.777626 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.777647 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.777662 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:28Z","lastTransitionTime":"2025-10-12T07:35:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.782246 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:28Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.792980 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:28Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.804803 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:28Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.817647 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:28Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.828217 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:35:28Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.840528 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:28Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.852083 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:28Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.865178 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:28Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.876464 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:28Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.880682 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.880718 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.880728 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:28 crc 
kubenswrapper[4599]: I1012 07:35:28.880747 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.880762 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:28Z","lastTransitionTime":"2025-10-12T07:35:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.983585 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.983633 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.983644 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.983662 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:28 crc kubenswrapper[4599]: I1012 07:35:28.983674 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:28Z","lastTransitionTime":"2025-10-12T07:35:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.086649 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.086705 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.086716 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.086734 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.086745 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:29Z","lastTransitionTime":"2025-10-12T07:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.154265 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:35:29 crc kubenswrapper[4599]: E1012 07:35:29.154546 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-12 07:35:37.154508078 +0000 UTC m=+33.943703580 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.189782 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.189823 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.189833 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.189851 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.189863 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:29Z","lastTransitionTime":"2025-10-12T07:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.255231 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.255283 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.255305 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.255359 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:29 crc kubenswrapper[4599]: E1012 07:35:29.255428 4599 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Oct 12 07:35:29 crc kubenswrapper[4599]: E1012 07:35:29.255492 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:37.2554799 +0000 UTC m=+34.044675402 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 07:35:29 crc kubenswrapper[4599]: E1012 07:35:29.255429 4599 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 07:35:29 crc kubenswrapper[4599]: E1012 07:35:29.255556 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 07:35:29 crc kubenswrapper[4599]: E1012 07:35:29.255596 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 07:35:29 crc kubenswrapper[4599]: E1012 07:35:29.255612 4599 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:29 crc kubenswrapper[4599]: E1012 07:35:29.255611 4599 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 07:35:29 crc kubenswrapper[4599]: E1012 07:35:29.255677 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 07:35:29 crc kubenswrapper[4599]: E1012 07:35:29.255718 4599 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:29 crc kubenswrapper[4599]: E1012 07:35:29.255574 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:37.255562248 +0000 UTC m=+34.044757760 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 07:35:29 crc kubenswrapper[4599]: E1012 07:35:29.255782 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:37.255764416 +0000 UTC m=+34.044959918 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:29 crc kubenswrapper[4599]: E1012 07:35:29.255801 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:37.255790747 +0000 UTC m=+34.044986248 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.292109 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.292167 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.292182 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.292225 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.292238 4599 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:29Z","lastTransitionTime":"2025-10-12T07:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.394318 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.394562 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.394570 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.394585 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.394594 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:29Z","lastTransitionTime":"2025-10-12T07:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.497362 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.497415 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.497427 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.497446 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.497460 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:29Z","lastTransitionTime":"2025-10-12T07:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.544837 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.544837 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:29 crc kubenswrapper[4599]: E1012 07:35:29.544963 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:35:29 crc kubenswrapper[4599]: E1012 07:35:29.545025 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.544857 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:29 crc kubenswrapper[4599]: E1012 07:35:29.545116 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.600712 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.600756 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.600765 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.600779 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.600793 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:29Z","lastTransitionTime":"2025-10-12T07:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.703501 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.703545 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.703557 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.703583 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.703596 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:29Z","lastTransitionTime":"2025-10-12T07:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.707909 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerStarted","Data":"e32f2e2e03b1ed5891ac7e16e96345ae592a5bd631c69ebe61bd9c29bd9a192f"} Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.708160 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.708185 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.722451 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.729103 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.729195 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.734986 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.745289 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.754879 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.764741 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.773977 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.782844 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.791959 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.800130 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.806421 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.806461 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.806471 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.806487 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.806497 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:29Z","lastTransitionTime":"2025-10-12T07:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.811528 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.821414 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.828580 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.836524 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.849695 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32f2e2e03b1ed5891ac7e16e96345ae592a5bd631c69ebe61bd9c29bd9a192f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.859792 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.870530 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.879966 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.890478 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.900623 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.908796 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.908839 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.908851 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.908866 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.908876 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:29Z","lastTransitionTime":"2025-10-12T07:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.909463 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.916894 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.930124 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32f2e2e03b1ed5891ac7e16e96345ae592a5bd631c69ebe61bd9c29bd9a192f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.937935 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.946155 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.953402 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f93
8aca3c90b68c447db21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.963605 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 
07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.973960 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:29 crc kubenswrapper[4599]: I1012 07:35:29.983968 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:29Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.011292 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.011327 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.011372 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.011403 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.011417 4599 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:30Z","lastTransitionTime":"2025-10-12T07:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.114302 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.114375 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.114389 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.114412 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.114425 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:30Z","lastTransitionTime":"2025-10-12T07:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.216898 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.216935 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.216944 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.216959 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.216969 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:30Z","lastTransitionTime":"2025-10-12T07:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.320669 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.320728 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.320737 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.320756 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.320768 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:30Z","lastTransitionTime":"2025-10-12T07:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.423395 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.423454 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.423465 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.423484 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.423494 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:30Z","lastTransitionTime":"2025-10-12T07:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.527038 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.527075 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.527084 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.527099 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.527109 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:30Z","lastTransitionTime":"2025-10-12T07:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.629912 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.629954 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.629964 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.629981 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.629992 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:30Z","lastTransitionTime":"2025-10-12T07:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.710832 4599 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.732353 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.732399 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.732406 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.732419 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.732432 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:30Z","lastTransitionTime":"2025-10-12T07:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.834696 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.834741 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.834751 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.834768 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.834778 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:30Z","lastTransitionTime":"2025-10-12T07:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.937286 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.937365 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.937390 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.937410 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:30 crc kubenswrapper[4599]: I1012 07:35:30.937422 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:30Z","lastTransitionTime":"2025-10-12T07:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.042386 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.042430 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.042440 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.042455 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.042465 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:31Z","lastTransitionTime":"2025-10-12T07:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.144592 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.144652 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.144662 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.144688 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.144706 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:31Z","lastTransitionTime":"2025-10-12T07:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.249103 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.249150 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.249162 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.249181 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.249195 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:31Z","lastTransitionTime":"2025-10-12T07:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.351776 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.351825 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.351840 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.351862 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.351872 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:31Z","lastTransitionTime":"2025-10-12T07:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.454557 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.454601 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.454611 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.454631 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.454645 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:31Z","lastTransitionTime":"2025-10-12T07:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.544451 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.544492 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.544498 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:31 crc kubenswrapper[4599]: E1012 07:35:31.544586 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:35:31 crc kubenswrapper[4599]: E1012 07:35:31.544696 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:35:31 crc kubenswrapper[4599]: E1012 07:35:31.544805 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.556497 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.556549 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.556562 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.556577 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.556589 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:31Z","lastTransitionTime":"2025-10-12T07:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.658896 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.658950 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.658961 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.658977 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.658992 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:31Z","lastTransitionTime":"2025-10-12T07:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.720049 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whk5b_1a95b7ab-8632-4332-a30f-64f28ef8d313/ovnkube-controller/0.log" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.722817 4599 generic.go:334] "Generic (PLEG): container finished" podID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerID="e32f2e2e03b1ed5891ac7e16e96345ae592a5bd631c69ebe61bd9c29bd9a192f" exitCode=1 Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.722872 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerDied","Data":"e32f2e2e03b1ed5891ac7e16e96345ae592a5bd631c69ebe61bd9c29bd9a192f"} Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.723641 4599 scope.go:117] "RemoveContainer" containerID="e32f2e2e03b1ed5891ac7e16e96345ae592a5bd631c69ebe61bd9c29bd9a192f" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.732862 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:31Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.741124 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:31Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.755873 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32f2e2e03b1ed5891ac7e16e96345ae592a5bd631c69ebe61bd9c29bd9a192f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e32f2e2e03b1ed5891ac7e16e96345ae592a5bd631c69ebe61bd9c29bd9a192f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1012 07:35:31.359812 5883 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 07:35:31.359827 5883 handler.go:190] Sending *v1.Node event handler 7 
for removal\\\\nI1012 07:35:31.359854 5883 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 07:35:31.359866 5883 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1012 07:35:31.359917 5883 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 07:35:31.359924 5883 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1012 07:35:31.359943 5883 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1012 07:35:31.359954 5883 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1012 07:35:31.359958 5883 handler.go:208] Removed *v1.Node event handler 7\\\\nI1012 07:35:31.359957 5883 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 07:35:31.359977 5883 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 07:35:31.359988 5883 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1012 07:35:31.360014 5883 factory.go:656] Stopping watch factory\\\\nI1012 07:35:31.360027 5883 ovnkube.go:599] Stopped ovnkube\\\\nI1012 07:35:31.360047 5883 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:31Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.761486 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.761529 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.761539 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.761559 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.761572 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:31Z","lastTransitionTime":"2025-10-12T07:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.769296 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:31Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.782012 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:31Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.792100 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:31Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.800883 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:31Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.811685 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:31Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.826990 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:31Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.837377 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:31Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.847643 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:31Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.860160 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:35:31Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.863685 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.863738 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.863750 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.863767 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.863777 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:31Z","lastTransitionTime":"2025-10-12T07:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.870716 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:31Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.879261 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:31Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.967912 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.967971 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.967983 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.968010 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:31 crc kubenswrapper[4599]: I1012 07:35:31.968025 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:31Z","lastTransitionTime":"2025-10-12T07:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.070762 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.070800 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.070809 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.070824 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.070834 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:32Z","lastTransitionTime":"2025-10-12T07:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.173325 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.173402 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.173413 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.173436 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.173450 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:32Z","lastTransitionTime":"2025-10-12T07:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.275622 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.275664 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.275673 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.275689 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.275698 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:32Z","lastTransitionTime":"2025-10-12T07:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.377615 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.377663 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.377672 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.377691 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.377702 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:32Z","lastTransitionTime":"2025-10-12T07:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.480446 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.480513 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.480523 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.480540 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.480551 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:32Z","lastTransitionTime":"2025-10-12T07:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.582250 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.582300 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.582311 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.582324 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.582345 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:32Z","lastTransitionTime":"2025-10-12T07:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.684746 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.684807 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.684821 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.684841 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.684853 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:32Z","lastTransitionTime":"2025-10-12T07:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.727584 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whk5b_1a95b7ab-8632-4332-a30f-64f28ef8d313/ovnkube-controller/1.log" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.728217 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whk5b_1a95b7ab-8632-4332-a30f-64f28ef8d313/ovnkube-controller/0.log" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.730953 4599 generic.go:334] "Generic (PLEG): container finished" podID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerID="a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5" exitCode=1 Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.730993 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerDied","Data":"a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5"} Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.731051 4599 scope.go:117] "RemoveContainer" containerID="e32f2e2e03b1ed5891ac7e16e96345ae592a5bd631c69ebe61bd9c29bd9a192f" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.731554 4599 scope.go:117] "RemoveContainer" containerID="a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5" Oct 12 07:35:32 crc kubenswrapper[4599]: E1012 07:35:32.731724 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\"" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.744013 4599 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.751275 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19
888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.764457 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e32f2e2e03b1ed5891ac7e16e96345ae592a5bd631c69ebe61bd9c29bd9a192f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1012 07:35:31.359812 5883 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 07:35:31.359827 5883 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1012 07:35:31.359854 5883 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 07:35:31.359866 5883 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1012 07:35:31.359917 5883 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 07:35:31.359924 5883 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1012 07:35:31.359943 5883 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1012 07:35:31.359954 5883 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1012 07:35:31.359958 5883 handler.go:208] Removed *v1.Node event handler 7\\\\nI1012 07:35:31.359957 5883 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 07:35:31.359977 5883 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 07:35:31.359988 5883 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1012 07:35:31.360014 5883 factory.go:656] Stopping watch factory\\\\nI1012 07:35:31.360027 5883 ovnkube.go:599] Stopped ovnkube\\\\nI1012 07:35:31.360047 5883 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:32Z\\\",\\\"message\\\":\\\"n network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z]\\\\nI1012 07:35:32.411040 6004 services_controller.go:454] Service openshift-machine-api/machine-api-operator-machine-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1012 07:35:32.411167 6004 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-5mz5c\\\\nI1012 07:35:32.411173 6004 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-f5988\\\\nI1012 07:35:32.411177 6004 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-5mz5c\\\\nI1012 07:35:32.411180 6004 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-f5988\\\\nI1012 07:35:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.774838 4599 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.786781 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.786816 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.786829 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.786849 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.786862 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:32Z","lastTransitionTime":"2025-10-12T07:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.787458 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.795498 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.805599 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 
07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.814408 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.822728 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.837264 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.847695 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.856230 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.867176 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.876207 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.889889 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.889921 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.889931 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.889945 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.889954 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:32Z","lastTransitionTime":"2025-10-12T07:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.992834 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.992873 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.992884 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.992897 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:32 crc kubenswrapper[4599]: I1012 07:35:32.992910 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:32Z","lastTransitionTime":"2025-10-12T07:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.095216 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.095266 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.095276 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.095296 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.095309 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:33Z","lastTransitionTime":"2025-10-12T07:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.109138 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns"] Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.109852 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.112808 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.112882 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.124995 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e32f2e2e03b1ed5891ac7e16e96345ae592a5bd631c69ebe61bd9c29bd9a192f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1012 07:35:31.359812 5883 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 07:35:31.359827 5883 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1012 07:35:31.359854 5883 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 07:35:31.359866 5883 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1012 07:35:31.359917 5883 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 07:35:31.359924 5883 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1012 07:35:31.359943 5883 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1012 07:35:31.359954 5883 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1012 07:35:31.359958 5883 handler.go:208] Removed *v1.Node event handler 7\\\\nI1012 07:35:31.359957 5883 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 07:35:31.359977 5883 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 07:35:31.359988 5883 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1012 07:35:31.360014 5883 factory.go:656] Stopping watch factory\\\\nI1012 07:35:31.360027 5883 ovnkube.go:599] Stopped ovnkube\\\\nI1012 07:35:31.360047 5883 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:32Z\\\",\\\"message\\\":\\\"n network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z]\\\\nI1012 07:35:32.411040 6004 services_controller.go:454] Service openshift-machine-api/machine-api-operator-machine-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1012 07:35:32.411167 6004 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-5mz5c\\\\nI1012 07:35:32.411173 6004 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-f5988\\\\nI1012 07:35:32.411177 6004 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-5mz5c\\\\nI1012 07:35:32.411180 6004 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-f5988\\\\nI1012 07:35:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.134825 4599 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.142466 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb2767
03f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.151535 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.159933 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.171833 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.182366 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.192435 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.196935 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.196965 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.196976 4599 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.196988 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.196998 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:33Z","lastTransitionTime":"2025-10-12T07:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.202364 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.219985 4599 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.231090 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.241327 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.250397 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.259399 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.267530 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.294913 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29fbfc4b-8e32-4132-a34d-48b25ec31428-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xl2ns\" (UID: \"29fbfc4b-8e32-4132-a34d-48b25ec31428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.294945 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29fbfc4b-8e32-4132-a34d-48b25ec31428-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xl2ns\" (UID: \"29fbfc4b-8e32-4132-a34d-48b25ec31428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.294967 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqqzg\" (UniqueName: \"kubernetes.io/projected/29fbfc4b-8e32-4132-a34d-48b25ec31428-kube-api-access-sqqzg\") pod \"ovnkube-control-plane-749d76644c-xl2ns\" (UID: \"29fbfc4b-8e32-4132-a34d-48b25ec31428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.295018 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29fbfc4b-8e32-4132-a34d-48b25ec31428-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xl2ns\" (UID: \"29fbfc4b-8e32-4132-a34d-48b25ec31428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.300809 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.300844 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.300855 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.300870 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:33 crc kubenswrapper[4599]: 
I1012 07:35:33.300880 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:33Z","lastTransitionTime":"2025-10-12T07:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.395908 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29fbfc4b-8e32-4132-a34d-48b25ec31428-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xl2ns\" (UID: \"29fbfc4b-8e32-4132-a34d-48b25ec31428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.395944 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29fbfc4b-8e32-4132-a34d-48b25ec31428-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xl2ns\" (UID: \"29fbfc4b-8e32-4132-a34d-48b25ec31428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.395963 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29fbfc4b-8e32-4132-a34d-48b25ec31428-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xl2ns\" (UID: \"29fbfc4b-8e32-4132-a34d-48b25ec31428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.395988 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqqzg\" (UniqueName: 
\"kubernetes.io/projected/29fbfc4b-8e32-4132-a34d-48b25ec31428-kube-api-access-sqqzg\") pod \"ovnkube-control-plane-749d76644c-xl2ns\" (UID: \"29fbfc4b-8e32-4132-a34d-48b25ec31428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.396895 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29fbfc4b-8e32-4132-a34d-48b25ec31428-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xl2ns\" (UID: \"29fbfc4b-8e32-4132-a34d-48b25ec31428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.397221 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29fbfc4b-8e32-4132-a34d-48b25ec31428-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xl2ns\" (UID: \"29fbfc4b-8e32-4132-a34d-48b25ec31428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.400921 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29fbfc4b-8e32-4132-a34d-48b25ec31428-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xl2ns\" (UID: \"29fbfc4b-8e32-4132-a34d-48b25ec31428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.403038 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.403079 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.403091 4599 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.403111 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.403122 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:33Z","lastTransitionTime":"2025-10-12T07:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.410884 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqqzg\" (UniqueName: \"kubernetes.io/projected/29fbfc4b-8e32-4132-a34d-48b25ec31428-kube-api-access-sqqzg\") pod \"ovnkube-control-plane-749d76644c-xl2ns\" (UID: \"29fbfc4b-8e32-4132-a34d-48b25ec31428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.422848 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" Oct 12 07:35:33 crc kubenswrapper[4599]: W1012 07:35:33.433038 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29fbfc4b_8e32_4132_a34d_48b25ec31428.slice/crio-22ee0bbaaab3bac0b19b20fff16d4c8b6dea5b95510a5dddbbf8e155777eb629 WatchSource:0}: Error finding container 22ee0bbaaab3bac0b19b20fff16d4c8b6dea5b95510a5dddbbf8e155777eb629: Status 404 returned error can't find the container with id 22ee0bbaaab3bac0b19b20fff16d4c8b6dea5b95510a5dddbbf8e155777eb629 Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.505435 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.505480 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.505490 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.505507 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.505555 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:33Z","lastTransitionTime":"2025-10-12T07:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.544345 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.544377 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.544393 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:33 crc kubenswrapper[4599]: E1012 07:35:33.544488 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:35:33 crc kubenswrapper[4599]: E1012 07:35:33.544625 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:35:33 crc kubenswrapper[4599]: E1012 07:35:33.544763 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.557130 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.566052 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.576654 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.586435 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.595296 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.606871 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.606897 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.606907 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.606922 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.606932 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:33Z","lastTransitionTime":"2025-10-12T07:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.610047 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.623152 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e32f2e2e03b1ed5891ac7e16e96345ae592a5bd631c69ebe61bd9c29bd9a192f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1012 07:35:31.359812 5883 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 07:35:31.359827 5883 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1012 07:35:31.359854 5883 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 07:35:31.359866 5883 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1012 07:35:31.359917 5883 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 07:35:31.359924 5883 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1012 07:35:31.359943 5883 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1012 07:35:31.359954 5883 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1012 07:35:31.359958 5883 handler.go:208] Removed *v1.Node event handler 7\\\\nI1012 07:35:31.359957 5883 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 07:35:31.359977 5883 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 07:35:31.359988 5883 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1012 07:35:31.360014 5883 factory.go:656] Stopping watch factory\\\\nI1012 07:35:31.360027 5883 ovnkube.go:599] Stopped ovnkube\\\\nI1012 07:35:31.360047 5883 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:32Z\\\",\\\"message\\\":\\\"n network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z]\\\\nI1012 07:35:32.411040 6004 services_controller.go:454] Service openshift-machine-api/machine-api-operator-machine-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1012 07:35:32.411167 6004 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-5mz5c\\\\nI1012 07:35:32.411173 6004 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-f5988\\\\nI1012 07:35:32.411177 6004 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-5mz5c\\\\nI1012 07:35:32.411180 6004 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-f5988\\\\nI1012 
07:35:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a2
91a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.631043 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.638000 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.646940 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.656310 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.667923 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.679634 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.692717 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.708530 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.710853 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.710888 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.710899 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.710914 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.710924 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:33Z","lastTransitionTime":"2025-10-12T07:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.735185 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" event={"ID":"29fbfc4b-8e32-4132-a34d-48b25ec31428","Type":"ContainerStarted","Data":"b84de1dd84c3c72376a170b7ccbdfff8c3bdd5e9e18a344002e112e81ca555a4"} Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.735236 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" event={"ID":"29fbfc4b-8e32-4132-a34d-48b25ec31428","Type":"ContainerStarted","Data":"7e2f2c3c17ada7047edb8e0fcd51227104458618c98e5ddec12123aa7ee173e8"} Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.735247 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" event={"ID":"29fbfc4b-8e32-4132-a34d-48b25ec31428","Type":"ContainerStarted","Data":"22ee0bbaaab3bac0b19b20fff16d4c8b6dea5b95510a5dddbbf8e155777eb629"} Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.736717 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whk5b_1a95b7ab-8632-4332-a30f-64f28ef8d313/ovnkube-controller/1.log" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.739573 4599 scope.go:117] "RemoveContainer" containerID="a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5" Oct 12 07:35:33 crc kubenswrapper[4599]: E1012 07:35:33.739751 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\"" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" Oct 12 07:35:33 
crc kubenswrapper[4599]: I1012 07:35:33.752227 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.764517 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.775959 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.784884 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.793241 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2f2c3c17ada7047edb8e0fcd51227104458618c98e5ddec12123aa7ee173e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84de1dd84c3c72376a170b7ccbdfff8c3bdd
5e9e18a344002e112e81ca555a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.801874 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.810157 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.812754 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.812784 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.812792 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 
07:35:33.812807 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.812815 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:33Z","lastTransitionTime":"2025-10-12T07:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.818777 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.827132 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.835266 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.845509 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.855533 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.868048 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.875431 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.890445 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e32f2e2e03b1ed5891ac7e16e96345ae592a5bd631c69ebe61bd9c29bd9a192f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1012 07:35:31.359812 5883 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 07:35:31.359827 5883 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1012 07:35:31.359854 5883 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 07:35:31.359866 5883 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1012 07:35:31.359917 5883 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 07:35:31.359924 5883 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1012 07:35:31.359943 5883 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1012 07:35:31.359954 5883 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1012 07:35:31.359958 5883 handler.go:208] Removed *v1.Node event handler 7\\\\nI1012 07:35:31.359957 5883 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 07:35:31.359977 5883 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 07:35:31.359988 5883 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1012 07:35:31.360014 5883 factory.go:656] Stopping watch factory\\\\nI1012 07:35:31.360027 5883 ovnkube.go:599] Stopped ovnkube\\\\nI1012 07:35:31.360047 5883 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:32Z\\\",\\\"message\\\":\\\"n network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z]\\\\nI1012 07:35:32.411040 6004 services_controller.go:454] Service openshift-machine-api/machine-api-operator-machine-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1012 07:35:32.411167 6004 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-5mz5c\\\\nI1012 07:35:32.411173 6004 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-f5988\\\\nI1012 07:35:32.411177 6004 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-5mz5c\\\\nI1012 07:35:32.411180 6004 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-f5988\\\\nI1012 07:35:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.899726 4599 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.909152 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.914991 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:33 crc 
kubenswrapper[4599]: I1012 07:35:33.915050 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.915061 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.915076 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.915087 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:33Z","lastTransitionTime":"2025-10-12T07:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.919464 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.927640 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.936537 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.943859 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.953148 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.960015 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.968568 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.981847 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:32Z\\\",\\\"message\\\":\\\"n network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z]\\\\nI1012 07:35:32.411040 6004 services_controller.go:454] Service openshift-machine-api/machine-api-operator-machine-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1012 07:35:32.411167 6004 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-5mz5c\\\\nI1012 07:35:32.411173 6004 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-f5988\\\\nI1012 07:35:32.411177 6004 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-5mz5c\\\\nI1012 07:35:32.411180 6004 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-f5988\\\\nI1012 07:35:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9
215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:33 crc kubenswrapper[4599]: I1012 07:35:33.991475 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.002156 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.012041 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.017194 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.017229 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.017239 4599 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.017255 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.017264 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:34Z","lastTransitionTime":"2025-10-12T07:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.020717 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd03
7e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447db21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.028528 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2f2c3c17ada7047edb8e0fcd51227104458618c98e5ddec12123aa7ee173e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d
06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84de1dd84c3c72376a170b7ccbdfff8c3bdd5e9e18a344002e112e81ca555a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.119951 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.120011 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.120025 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.120046 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.120058 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:34Z","lastTransitionTime":"2025-10-12T07:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.198449 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.198518 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.198529 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.198550 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.198567 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:34Z","lastTransitionTime":"2025-10-12T07:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:34 crc kubenswrapper[4599]: E1012 07:35:34.209137 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.213025 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.213078 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.213093 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.213112 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.213122 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:34Z","lastTransitionTime":"2025-10-12T07:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:34 crc kubenswrapper[4599]: E1012 07:35:34.222789 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.227004 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.227040 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.227053 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.227068 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.227078 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:34Z","lastTransitionTime":"2025-10-12T07:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:34 crc kubenswrapper[4599]: E1012 07:35:34.240580 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.243814 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.243847 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.243858 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.243878 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.243889 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:34Z","lastTransitionTime":"2025-10-12T07:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:34 crc kubenswrapper[4599]: E1012 07:35:34.255271 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.258490 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.258540 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.258553 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.258570 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.258581 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:34Z","lastTransitionTime":"2025-10-12T07:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:34 crc kubenswrapper[4599]: E1012 07:35:34.268015 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: E1012 07:35:34.268125 4599 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.269393 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.269423 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.269446 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.269457 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.269465 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:34Z","lastTransitionTime":"2025-10-12T07:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.289250 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.301441 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.316575 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.326696 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.336064 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.344519 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.355374 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.363111 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.371309 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.372211 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.372305 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.372395 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.372476 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.372536 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:34Z","lastTransitionTime":"2025-10-12T07:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.385075 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:32Z\\\",\\\"message\\\":\\\"n network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z]\\\\nI1012 07:35:32.411040 6004 services_controller.go:454] Service openshift-machine-api/machine-api-operator-machine-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1012 07:35:32.411167 6004 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-5mz5c\\\\nI1012 07:35:32.411173 6004 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-f5988\\\\nI1012 07:35:32.411177 6004 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-5mz5c\\\\nI1012 07:35:32.411180 6004 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-f5988\\\\nI1012 07:35:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9
215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.395523 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c2
3f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.405493 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.414382 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.425212 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.441243 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2f2c3c17ada7047edb8e0fcd51227104458618c98e5ddec12123aa7ee173e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84de1dd84c3c72376a170b7ccbdfff8c3bdd
5e9e18a344002e112e81ca555a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.461317 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.475374 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.475466 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.475478 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.475495 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.475505 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:34Z","lastTransitionTime":"2025-10-12T07:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.531055 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-kwphq"] Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.531531 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:34 crc kubenswrapper[4599]: E1012 07:35:34.531595 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.542544 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.552068 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.560235 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.568948 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2f2c3c17ada7047edb8e0fcd51227104458618c98e5ddec12123aa7ee173e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84de1dd84c3c72376a170b7ccbdfff8c3bdd
5e9e18a344002e112e81ca555a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.577403 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.577428 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.577443 4599 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.577458 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.577468 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:34Z","lastTransitionTime":"2025-10-12T07:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.580259 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] 
\\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.600011 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.640423 4599 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.679791 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.679854 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.679871 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.679898 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.679912 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:34Z","lastTransitionTime":"2025-10-12T07:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.682246 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.712322 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9v52\" (UniqueName: \"kubernetes.io/projected/3c3e76cc-139b-4a2a-b96b-6077e3706376-kube-api-access-f9v52\") pod \"network-metrics-daemon-kwphq\" (UID: \"3c3e76cc-139b-4a2a-b96b-6077e3706376\") " pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.712463 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs\") pod \"network-metrics-daemon-kwphq\" (UID: \"3c3e76cc-139b-4a2a-b96b-6077e3706376\") " pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.723634 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.760488 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.781618 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.781647 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.781656 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.781668 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.781677 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:34Z","lastTransitionTime":"2025-10-12T07:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.800628 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.813185 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9v52\" (UniqueName: \"kubernetes.io/projected/3c3e76cc-139b-4a2a-b96b-6077e3706376-kube-api-access-f9v52\") pod \"network-metrics-daemon-kwphq\" (UID: \"3c3e76cc-139b-4a2a-b96b-6077e3706376\") " pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.813216 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs\") pod \"network-metrics-daemon-kwphq\" (UID: \"3c3e76cc-139b-4a2a-b96b-6077e3706376\") " pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:34 crc kubenswrapper[4599]: E1012 07:35:34.813355 4599 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 07:35:34 crc kubenswrapper[4599]: E1012 07:35:34.813408 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs podName:3c3e76cc-139b-4a2a-b96b-6077e3706376 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:35.31339459 +0000 UTC m=+32.102590093 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs") pod "network-metrics-daemon-kwphq" (UID: "3c3e76cc-139b-4a2a-b96b-6077e3706376") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.849036 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9v52\" (UniqueName: \"kubernetes.io/projected/3c3e76cc-139b-4a2a-b96b-6077e3706376-kube-api-access-f9v52\") pod \"network-metrics-daemon-kwphq\" (UID: \"3c3e76cc-139b-4a2a-b96b-6077e3706376\") " pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.862433 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.883279 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.883312 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.883323 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.883357 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.883368 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:34Z","lastTransitionTime":"2025-10-12T07:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.899449 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.940911 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e8
9af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.984764 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.984796 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.984805 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 
07:35:34.984820 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.984831 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:34Z","lastTransitionTime":"2025-10-12T07:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:34 crc kubenswrapper[4599]: I1012 07:35:34.986781 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:32Z\\\",\\\"message\\\":\\\"n network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z]\\\\nI1012 07:35:32.411040 6004 services_controller.go:454] Service openshift-machine-api/machine-api-operator-machine-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1012 07:35:32.411167 6004 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-5mz5c\\\\nI1012 07:35:32.411173 6004 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-f5988\\\\nI1012 07:35:32.411177 6004 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-5mz5c\\\\nI1012 07:35:32.411180 6004 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-f5988\\\\nI1012 07:35:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9
215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:34Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.020002 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kwphq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3e76cc-139b-4a2a-b96b-6077e3706376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kwphq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:35Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.086114 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.086140 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.086150 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.086163 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.086171 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:35Z","lastTransitionTime":"2025-10-12T07:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.192519 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.192997 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.193097 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.193323 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.193389 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:35Z","lastTransitionTime":"2025-10-12T07:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.296619 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.296657 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.296671 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.296691 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.296707 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:35Z","lastTransitionTime":"2025-10-12T07:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.317382 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs\") pod \"network-metrics-daemon-kwphq\" (UID: \"3c3e76cc-139b-4a2a-b96b-6077e3706376\") " pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:35 crc kubenswrapper[4599]: E1012 07:35:35.317652 4599 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 07:35:35 crc kubenswrapper[4599]: E1012 07:35:35.317779 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs podName:3c3e76cc-139b-4a2a-b96b-6077e3706376 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:36.317751992 +0000 UTC m=+33.106947494 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs") pod "network-metrics-daemon-kwphq" (UID: "3c3e76cc-139b-4a2a-b96b-6077e3706376") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.399393 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.399430 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.399441 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.399468 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.399477 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:35Z","lastTransitionTime":"2025-10-12T07:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.501662 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.501697 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.501707 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.501721 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.501731 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:35Z","lastTransitionTime":"2025-10-12T07:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.544926 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.544955 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.544963 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:35 crc kubenswrapper[4599]: E1012 07:35:35.545061 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:35:35 crc kubenswrapper[4599]: E1012 07:35:35.545221 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:35:35 crc kubenswrapper[4599]: E1012 07:35:35.545390 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.604167 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.604202 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.604210 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.604224 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.604232 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:35Z","lastTransitionTime":"2025-10-12T07:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.706915 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.706977 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.706992 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.707017 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.707033 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:35Z","lastTransitionTime":"2025-10-12T07:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.809633 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.809673 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.809682 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.809699 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.809711 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:35Z","lastTransitionTime":"2025-10-12T07:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.913370 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.913412 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.913421 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.913437 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:35 crc kubenswrapper[4599]: I1012 07:35:35.913447 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:35Z","lastTransitionTime":"2025-10-12T07:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.015768 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.015808 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.015819 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.015837 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.015849 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:36Z","lastTransitionTime":"2025-10-12T07:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.118211 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.118260 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.118270 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.118289 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.118303 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:36Z","lastTransitionTime":"2025-10-12T07:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.220798 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.220826 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.220836 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.220854 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.220864 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:36Z","lastTransitionTime":"2025-10-12T07:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.323762 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.323807 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.323817 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.323833 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.323845 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:36Z","lastTransitionTime":"2025-10-12T07:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.328321 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs\") pod \"network-metrics-daemon-kwphq\" (UID: \"3c3e76cc-139b-4a2a-b96b-6077e3706376\") " pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:36 crc kubenswrapper[4599]: E1012 07:35:36.328539 4599 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 07:35:36 crc kubenswrapper[4599]: E1012 07:35:36.328634 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs podName:3c3e76cc-139b-4a2a-b96b-6077e3706376 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:38.328608416 +0000 UTC m=+35.117803918 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs") pod "network-metrics-daemon-kwphq" (UID: "3c3e76cc-139b-4a2a-b96b-6077e3706376") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.426560 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.426589 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.426598 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.426613 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.426621 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:36Z","lastTransitionTime":"2025-10-12T07:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.528170 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.528210 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.528219 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.528232 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.528243 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:36Z","lastTransitionTime":"2025-10-12T07:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.544193 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:36 crc kubenswrapper[4599]: E1012 07:35:36.544450 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.630029 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.630059 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.630071 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.630085 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.630095 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:36Z","lastTransitionTime":"2025-10-12T07:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.731416 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.731437 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.731446 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.731457 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.731466 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:36Z","lastTransitionTime":"2025-10-12T07:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.832841 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.832874 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.832884 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.832898 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.832907 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:36Z","lastTransitionTime":"2025-10-12T07:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.934628 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.934663 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.934676 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.934689 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:36 crc kubenswrapper[4599]: I1012 07:35:36.934697 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:36Z","lastTransitionTime":"2025-10-12T07:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.036966 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.037010 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.037020 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.037035 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.037045 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:37Z","lastTransitionTime":"2025-10-12T07:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.138947 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.138973 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.138982 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.138996 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.139005 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:37Z","lastTransitionTime":"2025-10-12T07:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.234741 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:35:37 crc kubenswrapper[4599]: E1012 07:35:37.234986 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-12 07:35:53.234969109 +0000 UTC m=+50.024164611 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.242671 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.242718 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.242734 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.242760 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.242777 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:37Z","lastTransitionTime":"2025-10-12T07:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.335222 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.335268 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.335291 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.335314 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:37 crc kubenswrapper[4599]: E1012 07:35:37.335429 4599 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Oct 12 07:35:37 crc kubenswrapper[4599]: E1012 07:35:37.335491 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:53.335468882 +0000 UTC m=+50.124664383 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 07:35:37 crc kubenswrapper[4599]: E1012 07:35:37.335704 4599 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 07:35:37 crc kubenswrapper[4599]: E1012 07:35:37.335756 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:53.335743945 +0000 UTC m=+50.124939448 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 07:35:37 crc kubenswrapper[4599]: E1012 07:35:37.335811 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 07:35:37 crc kubenswrapper[4599]: E1012 07:35:37.335872 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 07:35:37 crc kubenswrapper[4599]: E1012 07:35:37.335980 4599 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:37 crc kubenswrapper[4599]: E1012 07:35:37.335994 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 07:35:37 crc kubenswrapper[4599]: E1012 07:35:37.336065 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 07:35:37 crc kubenswrapper[4599]: E1012 07:35:37.336090 4599 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:37 crc kubenswrapper[4599]: E1012 07:35:37.336108 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:53.336074464 +0000 UTC m=+50.125269966 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:37 crc kubenswrapper[4599]: E1012 07:35:37.336200 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:53.336184183 +0000 UTC m=+50.125379685 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.345626 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.345675 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.345686 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.345703 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.345715 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:37Z","lastTransitionTime":"2025-10-12T07:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.447633 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.447661 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.447671 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.447682 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.447689 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:37Z","lastTransitionTime":"2025-10-12T07:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.544612 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.544698 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.544742 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:37 crc kubenswrapper[4599]: E1012 07:35:37.544841 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:35:37 crc kubenswrapper[4599]: E1012 07:35:37.544951 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:35:37 crc kubenswrapper[4599]: E1012 07:35:37.545006 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.549681 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.549713 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.549723 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.549742 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.549757 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:37Z","lastTransitionTime":"2025-10-12T07:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.652328 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.652450 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.652459 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.652477 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.652498 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:37Z","lastTransitionTime":"2025-10-12T07:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.754869 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.754914 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.754923 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.754939 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.754952 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:37Z","lastTransitionTime":"2025-10-12T07:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.856798 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.856845 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.856855 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.856865 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.856875 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:37Z","lastTransitionTime":"2025-10-12T07:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.958969 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.959031 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.959041 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.959064 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:37 crc kubenswrapper[4599]: I1012 07:35:37.959082 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:37Z","lastTransitionTime":"2025-10-12T07:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.062150 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.062214 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.062223 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.062246 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.062255 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:38Z","lastTransitionTime":"2025-10-12T07:35:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.164013 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.164067 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.164076 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.164096 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.164108 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:38Z","lastTransitionTime":"2025-10-12T07:35:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.266392 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.266428 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.266437 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.266453 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.266463 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:38Z","lastTransitionTime":"2025-10-12T07:35:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.344227 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs\") pod \"network-metrics-daemon-kwphq\" (UID: \"3c3e76cc-139b-4a2a-b96b-6077e3706376\") " pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:38 crc kubenswrapper[4599]: E1012 07:35:38.344393 4599 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 07:35:38 crc kubenswrapper[4599]: E1012 07:35:38.344456 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs podName:3c3e76cc-139b-4a2a-b96b-6077e3706376 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:42.344439201 +0000 UTC m=+39.133634703 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs") pod "network-metrics-daemon-kwphq" (UID: "3c3e76cc-139b-4a2a-b96b-6077e3706376") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.369036 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.369071 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.369080 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.369096 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.369109 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:38Z","lastTransitionTime":"2025-10-12T07:35:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.471763 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.471804 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.471812 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.471826 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.471835 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:38Z","lastTransitionTime":"2025-10-12T07:35:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.544750 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:38 crc kubenswrapper[4599]: E1012 07:35:38.544905 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.574157 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.574416 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.574478 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.574596 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.574662 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:38Z","lastTransitionTime":"2025-10-12T07:35:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.677000 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.677053 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.677063 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.677080 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.677091 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:38Z","lastTransitionTime":"2025-10-12T07:35:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.779572 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.779605 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.779613 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.779626 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.779634 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:38Z","lastTransitionTime":"2025-10-12T07:35:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.881504 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.881560 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.881570 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.881584 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.881592 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:38Z","lastTransitionTime":"2025-10-12T07:35:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.983682 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.983725 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.983734 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.983750 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:38 crc kubenswrapper[4599]: I1012 07:35:38.983758 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:38Z","lastTransitionTime":"2025-10-12T07:35:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.085920 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.085973 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.085982 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.085998 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.086011 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:39Z","lastTransitionTime":"2025-10-12T07:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.188199 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.188256 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.188268 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.188285 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.188295 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:39Z","lastTransitionTime":"2025-10-12T07:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.290221 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.290266 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.290275 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.290292 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.290302 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:39Z","lastTransitionTime":"2025-10-12T07:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.392512 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.395710 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.395730 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.395773 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.395791 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:39Z","lastTransitionTime":"2025-10-12T07:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.498826 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.498895 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.498907 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.498963 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.498975 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:39Z","lastTransitionTime":"2025-10-12T07:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.544716 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.544801 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.544826 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:39 crc kubenswrapper[4599]: E1012 07:35:39.545083 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:35:39 crc kubenswrapper[4599]: E1012 07:35:39.545146 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:35:39 crc kubenswrapper[4599]: E1012 07:35:39.545224 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.601165 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.601203 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.601212 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.601230 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.601240 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:39Z","lastTransitionTime":"2025-10-12T07:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.703130 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.703159 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.703168 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.703181 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.703191 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:39Z","lastTransitionTime":"2025-10-12T07:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.806251 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.806444 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.806541 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.806619 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.806676 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:39Z","lastTransitionTime":"2025-10-12T07:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.909065 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.909127 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.909143 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.909165 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:39 crc kubenswrapper[4599]: I1012 07:35:39.909180 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:39Z","lastTransitionTime":"2025-10-12T07:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.012244 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.012300 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.012310 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.012329 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.012354 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:40Z","lastTransitionTime":"2025-10-12T07:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.115098 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.115153 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.115167 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.115186 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.115199 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:40Z","lastTransitionTime":"2025-10-12T07:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.218669 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.218713 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.218723 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.218744 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.218759 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:40Z","lastTransitionTime":"2025-10-12T07:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.320875 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.320937 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.320947 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.320963 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.320975 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:40Z","lastTransitionTime":"2025-10-12T07:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.423799 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.423848 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.423858 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.423872 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.423883 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:40Z","lastTransitionTime":"2025-10-12T07:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.526586 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.526642 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.526652 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.526673 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.526683 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:40Z","lastTransitionTime":"2025-10-12T07:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.545265 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:40 crc kubenswrapper[4599]: E1012 07:35:40.545470 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.629753 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.629798 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.629815 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.629832 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.629844 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:40Z","lastTransitionTime":"2025-10-12T07:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.732166 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.732200 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.732210 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.732224 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.732234 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:40Z","lastTransitionTime":"2025-10-12T07:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.834414 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.834475 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.834485 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.834504 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.834518 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:40Z","lastTransitionTime":"2025-10-12T07:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.936254 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.936301 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.936318 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.936353 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:40 crc kubenswrapper[4599]: I1012 07:35:40.936362 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:40Z","lastTransitionTime":"2025-10-12T07:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.038377 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.038413 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.038422 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.038434 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.038442 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:41Z","lastTransitionTime":"2025-10-12T07:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.140680 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.140713 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.140721 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.140732 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.140741 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:41Z","lastTransitionTime":"2025-10-12T07:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.243701 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.243762 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.243775 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.243804 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.243819 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:41Z","lastTransitionTime":"2025-10-12T07:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.347661 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.347769 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.347788 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.347812 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.347824 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:41Z","lastTransitionTime":"2025-10-12T07:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.450594 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.450651 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.450663 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.450685 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.450698 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:41Z","lastTransitionTime":"2025-10-12T07:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.545420 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.545464 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.545423 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:41 crc kubenswrapper[4599]: E1012 07:35:41.545619 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:35:41 crc kubenswrapper[4599]: E1012 07:35:41.545772 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:35:41 crc kubenswrapper[4599]: E1012 07:35:41.545869 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.554620 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.554664 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.554706 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.554725 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.554736 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:41Z","lastTransitionTime":"2025-10-12T07:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.658050 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.658093 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.658121 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.658138 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.658148 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:41Z","lastTransitionTime":"2025-10-12T07:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.760857 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.760912 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.760924 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.760946 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.760955 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:41Z","lastTransitionTime":"2025-10-12T07:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.863160 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.863225 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.863238 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.863258 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.863269 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:41Z","lastTransitionTime":"2025-10-12T07:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.967515 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.967587 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.967600 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.967621 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:41 crc kubenswrapper[4599]: I1012 07:35:41.967637 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:41Z","lastTransitionTime":"2025-10-12T07:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.069656 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.069701 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.069709 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.069724 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.069736 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:42Z","lastTransitionTime":"2025-10-12T07:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.172767 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.172825 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.172835 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.172858 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.172878 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:42Z","lastTransitionTime":"2025-10-12T07:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.274693 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.274746 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.274761 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.274781 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.274791 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:42Z","lastTransitionTime":"2025-10-12T07:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.376473 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.376519 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.376529 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.376548 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.376559 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:42Z","lastTransitionTime":"2025-10-12T07:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.380116 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs\") pod \"network-metrics-daemon-kwphq\" (UID: \"3c3e76cc-139b-4a2a-b96b-6077e3706376\") " pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:42 crc kubenswrapper[4599]: E1012 07:35:42.380376 4599 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 07:35:42 crc kubenswrapper[4599]: E1012 07:35:42.380487 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs podName:3c3e76cc-139b-4a2a-b96b-6077e3706376 nodeName:}" failed. No retries permitted until 2025-10-12 07:35:50.3804607 +0000 UTC m=+47.169656202 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs") pod "network-metrics-daemon-kwphq" (UID: "3c3e76cc-139b-4a2a-b96b-6077e3706376") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.479021 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.479065 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.479077 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.479095 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.479106 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:42Z","lastTransitionTime":"2025-10-12T07:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.545124 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:42 crc kubenswrapper[4599]: E1012 07:35:42.545303 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.581558 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.581619 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.581633 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.581654 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.581667 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:42Z","lastTransitionTime":"2025-10-12T07:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.683827 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.683893 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.683907 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.683930 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.683945 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:42Z","lastTransitionTime":"2025-10-12T07:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.786373 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.786425 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.786436 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.786453 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.786469 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:42Z","lastTransitionTime":"2025-10-12T07:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.888923 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.888984 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.888997 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.889018 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.889031 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:42Z","lastTransitionTime":"2025-10-12T07:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.991436 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.991491 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.991505 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.991526 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:42 crc kubenswrapper[4599]: I1012 07:35:42.991542 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:42Z","lastTransitionTime":"2025-10-12T07:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.093746 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.093819 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.093832 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.093851 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.093878 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:43Z","lastTransitionTime":"2025-10-12T07:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.196435 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.196470 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.196478 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.196492 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.196501 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:43Z","lastTransitionTime":"2025-10-12T07:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.298929 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.299002 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.299014 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.299030 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.299042 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:43Z","lastTransitionTime":"2025-10-12T07:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.401642 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.401691 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.401702 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.401721 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.401731 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:43Z","lastTransitionTime":"2025-10-12T07:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.503575 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.503632 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.503642 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.503655 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.503668 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:43Z","lastTransitionTime":"2025-10-12T07:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.544649 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.544698 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.545102 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:43 crc kubenswrapper[4599]: E1012 07:35:43.549816 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:35:43 crc kubenswrapper[4599]: E1012 07:35:43.551215 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:35:43 crc kubenswrapper[4599]: E1012 07:35:43.551869 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.562109 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:43Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.570627 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:43Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.579459 4599 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:43Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.588109 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:43Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.596434 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:43Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.604320 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:35:43Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.605491 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.605531 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.605542 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.605557 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.605568 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:43Z","lastTransitionTime":"2025-10-12T07:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.617612 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:43Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.625420 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:43Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.632669 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:43Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.646898 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:32Z\\\",\\\"message\\\":\\\"n network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z]\\\\nI1012 07:35:32.411040 6004 services_controller.go:454] Service openshift-machine-api/machine-api-operator-machine-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1012 07:35:32.411167 6004 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-5mz5c\\\\nI1012 07:35:32.411173 6004 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-f5988\\\\nI1012 07:35:32.411177 6004 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-5mz5c\\\\nI1012 07:35:32.411180 6004 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-f5988\\\\nI1012 07:35:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9
215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:43Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.655786 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kwphq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3e76cc-139b-4a2a-b96b-6077e3706376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kwphq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:43Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.665873 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 
07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca742
1bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:43Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.676023 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:43Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.684591 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:43Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.693474 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:43Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.704138 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2f2c3c17ada7047edb8e0fcd51227104458618c98e5ddec12123aa7ee173e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84de1dd84c3c72376a170b7ccbdfff8c3bdd
5e9e18a344002e112e81ca555a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:43Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.708228 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.708287 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.708304 4599 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.708329 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.708366 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:43Z","lastTransitionTime":"2025-10-12T07:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.811260 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.811328 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.811365 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.811395 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.811408 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:43Z","lastTransitionTime":"2025-10-12T07:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.913837 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.913872 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.913882 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.913897 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:43 crc kubenswrapper[4599]: I1012 07:35:43.913908 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:43Z","lastTransitionTime":"2025-10-12T07:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.016291 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.016321 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.016331 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.016370 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.016383 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:44Z","lastTransitionTime":"2025-10-12T07:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.119098 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.119156 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.119166 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.119183 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.119194 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:44Z","lastTransitionTime":"2025-10-12T07:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.221436 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.221473 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.221482 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.221496 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.221506 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:44Z","lastTransitionTime":"2025-10-12T07:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.324013 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.324057 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.324065 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.324077 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.324086 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:44Z","lastTransitionTime":"2025-10-12T07:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.426035 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.426110 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.426121 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.426146 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.426175 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:44Z","lastTransitionTime":"2025-10-12T07:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.433761 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.433794 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.433803 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.433833 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.433844 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:44Z","lastTransitionTime":"2025-10-12T07:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:44 crc kubenswrapper[4599]: E1012 07:35:44.445103 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:44Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.449000 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.449056 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.449075 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.449086 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.449094 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:44Z","lastTransitionTime":"2025-10-12T07:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:44 crc kubenswrapper[4599]: E1012 07:35:44.458800 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:44Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.461763 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.461822 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.461835 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.461866 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.461879 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:44Z","lastTransitionTime":"2025-10-12T07:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:44 crc kubenswrapper[4599]: E1012 07:35:44.472068 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:44Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.474747 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.474792 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.474807 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.474824 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.474834 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:44Z","lastTransitionTime":"2025-10-12T07:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:44 crc kubenswrapper[4599]: E1012 07:35:44.484875 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:44Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.487531 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.487597 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.487621 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.487637 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.487645 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:44Z","lastTransitionTime":"2025-10-12T07:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:44 crc kubenswrapper[4599]: E1012 07:35:44.497172 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:44Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:44 crc kubenswrapper[4599]: E1012 07:35:44.497290 4599 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.528959 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.529065 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.529083 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.529097 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.529107 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:44Z","lastTransitionTime":"2025-10-12T07:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.544436 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:44 crc kubenswrapper[4599]: E1012 07:35:44.544544 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.631388 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.631433 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.631441 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.631451 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.631459 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:44Z","lastTransitionTime":"2025-10-12T07:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.733291 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.733365 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.733376 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.733398 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.733409 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:44Z","lastTransitionTime":"2025-10-12T07:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.836452 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.836501 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.836512 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.836531 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.836542 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:44Z","lastTransitionTime":"2025-10-12T07:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.939033 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.939095 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.939105 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.939129 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:44 crc kubenswrapper[4599]: I1012 07:35:44.939143 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:44Z","lastTransitionTime":"2025-10-12T07:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.041094 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.041140 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.041149 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.041164 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.041175 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:45Z","lastTransitionTime":"2025-10-12T07:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.143698 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.144107 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.144119 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.144138 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.144151 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:45Z","lastTransitionTime":"2025-10-12T07:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.247645 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.247693 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.247702 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.247724 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.247736 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:45Z","lastTransitionTime":"2025-10-12T07:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.351213 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.351266 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.351285 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.351303 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.351313 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:45Z","lastTransitionTime":"2025-10-12T07:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.454132 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.454387 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.454476 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.454559 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.454645 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:45Z","lastTransitionTime":"2025-10-12T07:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.544692 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.544758 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:45 crc kubenswrapper[4599]: E1012 07:35:45.544870 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.544891 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:45 crc kubenswrapper[4599]: E1012 07:35:45.545265 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.545458 4599 scope.go:117] "RemoveContainer" containerID="a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5" Oct 12 07:35:45 crc kubenswrapper[4599]: E1012 07:35:45.545505 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.556496 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.556520 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.556530 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.556545 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.556558 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:45Z","lastTransitionTime":"2025-10-12T07:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.658775 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.658826 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.658836 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.658853 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.658864 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:45Z","lastTransitionTime":"2025-10-12T07:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.762084 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.762123 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.762132 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.762145 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.762155 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:45Z","lastTransitionTime":"2025-10-12T07:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.771417 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whk5b_1a95b7ab-8632-4332-a30f-64f28ef8d313/ovnkube-controller/1.log" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.773756 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerStarted","Data":"72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a"} Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.773886 4599 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.786061 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f5
7b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447db21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z
\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:45Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.798982 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2f2c3c17ada7047edb8e0fcd51227104458618c98e5ddec12123aa7ee173e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84de1dd84c3c72376a170b7ccbdfff8c3bdd5e9e18a344002e112e81ca555a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:45Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.814807 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582577
1aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:45Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.827164 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:45Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.836308 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:45Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.846160 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:45Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.855199 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:45Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.863850 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.863885 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.863894 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.863911 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.863921 4599 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:45Z","lastTransitionTime":"2025-10-12T07:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.867796 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:45Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.883319 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:45Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.895836 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:45Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.907469 4599 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:45Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.918820 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:45Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.934403 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:32Z\\\",\\\"message\\\":\\\"n network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z]\\\\nI1012 07:35:32.411040 6004 services_controller.go:454] Service openshift-machine-api/machine-api-operator-machine-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1012 07:35:32.411167 6004 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-5mz5c\\\\nI1012 07:35:32.411173 6004 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-f5988\\\\nI1012 07:35:32.411177 6004 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-5mz5c\\\\nI1012 07:35:32.411180 6004 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-f5988\\\\nI1012 
07:35:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:45Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.943573 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kwphq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3e76cc-139b-4a2a-b96b-6077e3706376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kwphq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:45Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:45 crc 
kubenswrapper[4599]: I1012 07:35:45.950400 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h
7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:45Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.961140 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:45Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.965707 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.965752 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.965763 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.965781 4599 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Oct 12 07:35:45 crc kubenswrapper[4599]: I1012 07:35:45.965792 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:45Z","lastTransitionTime":"2025-10-12T07:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.068520 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.068578 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.068591 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.068611 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.068623 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:46Z","lastTransitionTime":"2025-10-12T07:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.170846 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.170889 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.170899 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.170915 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.170925 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:46Z","lastTransitionTime":"2025-10-12T07:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.273293 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.273356 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.273369 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.273390 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.273402 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:46Z","lastTransitionTime":"2025-10-12T07:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.375769 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.375829 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.375839 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.375861 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.375871 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:46Z","lastTransitionTime":"2025-10-12T07:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.477858 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.477902 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.477914 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.477931 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.477942 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:46Z","lastTransitionTime":"2025-10-12T07:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.544288 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:46 crc kubenswrapper[4599]: E1012 07:35:46.544482 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.580881 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.580912 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.580923 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.580942 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.580953 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:46Z","lastTransitionTime":"2025-10-12T07:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.683442 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.683483 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.683492 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.683511 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.683524 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:46Z","lastTransitionTime":"2025-10-12T07:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.780779 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whk5b_1a95b7ab-8632-4332-a30f-64f28ef8d313/ovnkube-controller/2.log" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.781440 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whk5b_1a95b7ab-8632-4332-a30f-64f28ef8d313/ovnkube-controller/1.log" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.784382 4599 generic.go:334] "Generic (PLEG): container finished" podID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerID="72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a" exitCode=1 Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.784434 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerDied","Data":"72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a"} Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.784491 4599 scope.go:117] "RemoveContainer" containerID="a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.785181 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.785240 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.785254 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.785269 4599 scope.go:117] "RemoveContainer" containerID="72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 
07:35:46.785281 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.785299 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:46Z","lastTransitionTime":"2025-10-12T07:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:46 crc kubenswrapper[4599]: E1012 07:35:46.785463 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\"" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.798433 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:46Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.811931 4599 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:46Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.822611 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:46Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.833059 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:46Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.845128 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:35:46Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.857805 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:46Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.867138 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:46Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.875876 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:46Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.888030 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.888072 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.888084 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.888105 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.888115 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:46Z","lastTransitionTime":"2025-10-12T07:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.890736 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b9102b596e6d192154de99c79ffcf4423af88d8c95b78cfc1a27a7651679b5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:32Z\\\",\\\"message\\\":\\\"n network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:32Z is after 2025-08-24T17:21:41Z]\\\\nI1012 07:35:32.411040 6004 services_controller.go:454] Service openshift-machine-api/machine-api-operator-machine-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1012 07:35:32.411167 6004 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-5mz5c\\\\nI1012 07:35:32.411173 6004 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-f5988\\\\nI1012 07:35:32.411177 6004 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-5mz5c\\\\nI1012 07:35:32.411180 6004 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-f5988\\\\nI1012 07:35:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:46Z\\\",\\\"message\\\":\\\"achine-config-daemon-5mz5c openshift-multus/multus-8hm26 openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns openshift-dns/node-resolver-f5988 openshift-multus/network-metrics-daemon-kwphq openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-operator/iptables-alerter-4ln5h]\\\\nI1012 07:35:46.210430 6238 
obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1012 07:35:46.210433 6238 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"n
ame\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:46Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.899025 4599 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-kwphq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3e76cc-139b-4a2a-b96b-6077e3706376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kwphq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:46Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:46 crc 
kubenswrapper[4599]: I1012 07:35:46.909511 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e531
9ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:46Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.920499 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:46Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.930293 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:46Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.939402 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:46Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.949787 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2f2c3c17ada7047edb8e0fcd51227104458618c98e5ddec12123aa7ee173e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84de1dd84c3c72376a170b7ccbdfff8c3bdd
5e9e18a344002e112e81ca555a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:46Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.960699 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:46Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.990558 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.990634 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.990839 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.990855 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:46 crc kubenswrapper[4599]: I1012 07:35:46.990866 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:46Z","lastTransitionTime":"2025-10-12T07:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.093677 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.093739 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.093751 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.093773 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.093789 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:47Z","lastTransitionTime":"2025-10-12T07:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.196193 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.196257 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.196271 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.196294 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.196311 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:47Z","lastTransitionTime":"2025-10-12T07:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.299156 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.299240 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.299255 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.299281 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.299295 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:47Z","lastTransitionTime":"2025-10-12T07:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.402003 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.402045 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.402055 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.402069 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.402081 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:47Z","lastTransitionTime":"2025-10-12T07:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.504434 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.504483 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.504493 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.504510 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.504521 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:47Z","lastTransitionTime":"2025-10-12T07:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.544492 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.544541 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:47 crc kubenswrapper[4599]: E1012 07:35:47.544689 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:35:47 crc kubenswrapper[4599]: E1012 07:35:47.544810 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.544703 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:47 crc kubenswrapper[4599]: E1012 07:35:47.544938 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.607372 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.607445 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.607457 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.607483 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.607496 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:47Z","lastTransitionTime":"2025-10-12T07:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.709505 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.709570 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.709580 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.709603 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.709613 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:47Z","lastTransitionTime":"2025-10-12T07:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.790362 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whk5b_1a95b7ab-8632-4332-a30f-64f28ef8d313/ovnkube-controller/2.log" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.811817 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.811869 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.811881 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.811905 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.811917 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:47Z","lastTransitionTime":"2025-10-12T07:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.914519 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.914562 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.914572 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.914590 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:47 crc kubenswrapper[4599]: I1012 07:35:47.914602 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:47Z","lastTransitionTime":"2025-10-12T07:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.017363 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.017426 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.017440 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.017462 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.017479 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:48Z","lastTransitionTime":"2025-10-12T07:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.120455 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.120534 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.120545 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.120566 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.120580 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:48Z","lastTransitionTime":"2025-10-12T07:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.222844 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.222889 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.222900 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.222915 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.222924 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:48Z","lastTransitionTime":"2025-10-12T07:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.325578 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.325617 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.325627 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.325643 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.325653 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:48Z","lastTransitionTime":"2025-10-12T07:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.428701 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.428756 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.428767 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.428786 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.428801 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:48Z","lastTransitionTime":"2025-10-12T07:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.531518 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.531563 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.531573 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.531589 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.531599 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:48Z","lastTransitionTime":"2025-10-12T07:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.544989 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:48 crc kubenswrapper[4599]: E1012 07:35:48.545143 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.633642 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.633705 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.633716 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.633726 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.633735 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:48Z","lastTransitionTime":"2025-10-12T07:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.735328 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.735406 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.735425 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.735447 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.735457 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:48Z","lastTransitionTime":"2025-10-12T07:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.837419 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.837483 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.837494 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.837511 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.837525 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:48Z","lastTransitionTime":"2025-10-12T07:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.940172 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.940230 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.940240 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.940260 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:48 crc kubenswrapper[4599]: I1012 07:35:48.940271 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:48Z","lastTransitionTime":"2025-10-12T07:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.042853 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.042897 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.042914 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.042931 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.042945 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:49Z","lastTransitionTime":"2025-10-12T07:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.145319 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.145377 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.145387 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.145403 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.145413 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:49Z","lastTransitionTime":"2025-10-12T07:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.247859 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.247906 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.247916 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.247935 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.247947 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:49Z","lastTransitionTime":"2025-10-12T07:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.350762 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.350844 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.350858 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.350882 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.350895 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:49Z","lastTransitionTime":"2025-10-12T07:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.453214 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.453261 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.453270 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.453286 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.453296 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:49Z","lastTransitionTime":"2025-10-12T07:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.544753 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.544817 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.544834 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:49 crc kubenswrapper[4599]: E1012 07:35:49.544939 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:35:49 crc kubenswrapper[4599]: E1012 07:35:49.545044 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:35:49 crc kubenswrapper[4599]: E1012 07:35:49.545123 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.555747 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.555817 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.555832 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.555853 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.555866 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:49Z","lastTransitionTime":"2025-10-12T07:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.658588 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.658652 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.658665 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.658685 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.658711 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:49Z","lastTransitionTime":"2025-10-12T07:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.761274 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.761315 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.761326 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.761356 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.761368 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:49Z","lastTransitionTime":"2025-10-12T07:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.863380 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.863430 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.863439 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.863457 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.863470 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:49Z","lastTransitionTime":"2025-10-12T07:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.965767 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.965816 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.965825 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.965841 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:49 crc kubenswrapper[4599]: I1012 07:35:49.965855 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:49Z","lastTransitionTime":"2025-10-12T07:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.067982 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.068015 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.068023 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.068035 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.068044 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:50Z","lastTransitionTime":"2025-10-12T07:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.169991 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.170022 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.170033 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.170045 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.170053 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:50Z","lastTransitionTime":"2025-10-12T07:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.272515 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.272548 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.272557 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.272569 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.272577 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:50Z","lastTransitionTime":"2025-10-12T07:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.374242 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.374273 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.374281 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.374292 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.374299 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:50Z","lastTransitionTime":"2025-10-12T07:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.451252 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs\") pod \"network-metrics-daemon-kwphq\" (UID: \"3c3e76cc-139b-4a2a-b96b-6077e3706376\") " pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:50 crc kubenswrapper[4599]: E1012 07:35:50.451444 4599 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 07:35:50 crc kubenswrapper[4599]: E1012 07:35:50.451511 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs podName:3c3e76cc-139b-4a2a-b96b-6077e3706376 nodeName:}" failed. No retries permitted until 2025-10-12 07:36:06.451495538 +0000 UTC m=+63.240691041 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs") pod "network-metrics-daemon-kwphq" (UID: "3c3e76cc-139b-4a2a-b96b-6077e3706376") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.476563 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.476592 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.476601 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.476612 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.476621 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:50Z","lastTransitionTime":"2025-10-12T07:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.545182 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:50 crc kubenswrapper[4599]: E1012 07:35:50.545394 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.579264 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.579290 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.579304 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.579316 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.579325 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:50Z","lastTransitionTime":"2025-10-12T07:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.682445 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.682516 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.682527 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.682550 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.682563 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:50Z","lastTransitionTime":"2025-10-12T07:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.702461 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.703185 4599 scope.go:117] "RemoveContainer" containerID="72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a" Oct 12 07:35:50 crc kubenswrapper[4599]: E1012 07:35:50.703355 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\"" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.716003 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:50Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.729089 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:50Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.739311 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:50Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.749360 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2f2c3c17ada7047edb8e0fcd51227104458618c98e5ddec12123aa7ee173e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84de1dd84c3c72376a170b7ccbdfff8c3bdd
5e9e18a344002e112e81ca555a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:50Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.760367 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c2
3f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:50Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.771170 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:50Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.780066 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:50Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.785680 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.785751 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.785775 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 
07:35:50.785810 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.785826 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:50Z","lastTransitionTime":"2025-10-12T07:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.790079 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:50Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.802293 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:50Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.813492 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:35:50Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.823503 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:50Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.838908 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:50Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.846555 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:50Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.856581 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:50Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.871921 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:46Z\\\",\\\"message\\\":\\\"achine-config-daemon-5mz5c openshift-multus/multus-8hm26 openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns openshift-dns/node-resolver-f5988 openshift-multus/network-metrics-daemon-kwphq openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/network-operator-58b4c7f79c-55gtf 
openshift-kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-operator/iptables-alerter-4ln5h]\\\\nI1012 07:35:46.210430 6238 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1012 07:35:46.210433 6238 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9
215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:50Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.882419 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kwphq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3e76cc-139b-4a2a-b96b-6077e3706376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kwphq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:50Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.889081 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.889126 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.889155 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.889174 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.889188 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:50Z","lastTransitionTime":"2025-10-12T07:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.992317 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.992387 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.992397 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.992415 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:50 crc kubenswrapper[4599]: I1012 07:35:50.992446 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:50Z","lastTransitionTime":"2025-10-12T07:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.096004 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.096064 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.096074 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.096094 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.096108 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:51Z","lastTransitionTime":"2025-10-12T07:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.198359 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.198414 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.198423 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.198439 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.198452 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:51Z","lastTransitionTime":"2025-10-12T07:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.301006 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.301060 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.301072 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.301088 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.301099 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:51Z","lastTransitionTime":"2025-10-12T07:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.403804 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.403852 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.403864 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.403880 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.403890 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:51Z","lastTransitionTime":"2025-10-12T07:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.506874 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.506931 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.506943 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.506960 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.506972 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:51Z","lastTransitionTime":"2025-10-12T07:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.545292 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.545355 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.545375 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:51 crc kubenswrapper[4599]: E1012 07:35:51.545465 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:35:51 crc kubenswrapper[4599]: E1012 07:35:51.545612 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:35:51 crc kubenswrapper[4599]: E1012 07:35:51.545806 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.609533 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.609582 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.609592 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.609614 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.609625 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:51Z","lastTransitionTime":"2025-10-12T07:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.711805 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.711852 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.711864 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.711883 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.711896 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:51Z","lastTransitionTime":"2025-10-12T07:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.814155 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.814202 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.814215 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.814233 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.814243 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:51Z","lastTransitionTime":"2025-10-12T07:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.917009 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.917069 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.917081 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.917105 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:51 crc kubenswrapper[4599]: I1012 07:35:51.917119 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:51Z","lastTransitionTime":"2025-10-12T07:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.019787 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.019851 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.019863 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.019885 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.019899 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:52Z","lastTransitionTime":"2025-10-12T07:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.122063 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.122117 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.122127 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.122144 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.122156 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:52Z","lastTransitionTime":"2025-10-12T07:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.224495 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.224541 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.224555 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.224572 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.224582 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:52Z","lastTransitionTime":"2025-10-12T07:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.327017 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.327079 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.327089 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.327107 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.327119 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:52Z","lastTransitionTime":"2025-10-12T07:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.429067 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.429109 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.429120 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.429134 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.429144 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:52Z","lastTransitionTime":"2025-10-12T07:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.531663 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.531705 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.531739 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.531765 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.531774 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:52Z","lastTransitionTime":"2025-10-12T07:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.544449 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:52 crc kubenswrapper[4599]: E1012 07:35:52.544605 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.634580 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.634629 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.634640 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.634656 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.634668 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:52Z","lastTransitionTime":"2025-10-12T07:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.732225 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.741517 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.741591 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.741609 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.741631 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.741648 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:52Z","lastTransitionTime":"2025-10-12T07:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.744907 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.745596 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:52Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.754779 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:35:52Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.765775 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:52Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.774404 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:52Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.782816 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:52Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.791420 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:52Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.804064 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:46Z\\\",\\\"message\\\":\\\"achine-config-daemon-5mz5c openshift-multus/multus-8hm26 openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns openshift-dns/node-resolver-f5988 openshift-multus/network-metrics-daemon-kwphq openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/network-operator-58b4c7f79c-55gtf 
openshift-kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-operator/iptables-alerter-4ln5h]\\\\nI1012 07:35:46.210430 6238 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1012 07:35:46.210433 6238 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9
215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:52Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.812063 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kwphq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3e76cc-139b-4a2a-b96b-6077e3706376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kwphq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:52Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.820210 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:52Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.826835 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:52Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.834633 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447db21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:52Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.842627 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2f2c3c17ada7047edb8e0fcd51227104458618c98e5ddec12123aa7ee173e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84de1dd84c3c72376a170b7ccbdfff8c3bdd5e9e18a344002e112e81ca555a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:52Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.843693 4599 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.843829 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.844092 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.844174 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.844236 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:52Z","lastTransitionTime":"2025-10-12T07:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.854986 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:52Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.865779 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:52Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.876873 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:52Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.887586 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:52Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.946828 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.947166 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.947240 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.947309 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:52 crc kubenswrapper[4599]: I1012 07:35:52.947403 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:52Z","lastTransitionTime":"2025-10-12T07:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.050041 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.050363 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.050448 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.050525 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.050599 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:53Z","lastTransitionTime":"2025-10-12T07:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.153476 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.153526 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.153537 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.153554 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.153566 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:53Z","lastTransitionTime":"2025-10-12T07:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.256039 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.256075 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.256087 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.256100 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.256108 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:53Z","lastTransitionTime":"2025-10-12T07:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.278019 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:35:53 crc kubenswrapper[4599]: E1012 07:35:53.278224 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-12 07:36:25.2781974 +0000 UTC m=+82.067392902 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.358484 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.358539 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.358564 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.358582 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.358592 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:53Z","lastTransitionTime":"2025-10-12T07:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.379070 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.379109 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.379128 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.379148 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:53 crc kubenswrapper[4599]: E1012 07:35:53.379226 4599 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Oct 12 07:35:53 crc kubenswrapper[4599]: E1012 07:35:53.379251 4599 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 07:35:53 crc kubenswrapper[4599]: E1012 07:35:53.379280 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 07:36:25.379264261 +0000 UTC m=+82.168459764 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 07:35:53 crc kubenswrapper[4599]: E1012 07:35:53.379294 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 07:36:25.379285381 +0000 UTC m=+82.168480883 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 07:35:53 crc kubenswrapper[4599]: E1012 07:35:53.379409 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 07:35:53 crc kubenswrapper[4599]: E1012 07:35:53.379449 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 07:35:53 crc kubenswrapper[4599]: E1012 07:35:53.379469 4599 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:53 crc kubenswrapper[4599]: E1012 07:35:53.379558 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 07:36:25.37953177 +0000 UTC m=+82.168727282 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:53 crc kubenswrapper[4599]: E1012 07:35:53.379590 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 07:35:53 crc kubenswrapper[4599]: E1012 07:35:53.379643 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 07:35:53 crc kubenswrapper[4599]: E1012 07:35:53.379661 4599 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:53 crc kubenswrapper[4599]: E1012 07:35:53.379755 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 07:36:25.37973697 +0000 UTC m=+82.168932482 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.461839 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.461942 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.461957 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.461983 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.462000 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:53Z","lastTransitionTime":"2025-10-12T07:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.544853 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.544945 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.544862 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:53 crc kubenswrapper[4599]: E1012 07:35:53.545026 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:35:53 crc kubenswrapper[4599]: E1012 07:35:53.545149 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:35:53 crc kubenswrapper[4599]: E1012 07:35:53.545388 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.559851 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:53Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.564827 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.564875 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.564887 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.564906 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.564916 4599 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:53Z","lastTransitionTime":"2025-10-12T07:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.573165 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:53Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.584661 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:53Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.594437 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:53Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.606102 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2f2c3c17ada7047edb8e0fcd51227104458618c98e5ddec12123aa7ee173e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84de1dd84c3c72376a170b7ccbdfff8c3bdd
5e9e18a344002e112e81ca555a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:53Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.619921 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:53Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.630419 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:53Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.644674 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:53Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.656870 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:53Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.666811 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:35:53Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.667061 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.667145 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.667214 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.667289 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.667390 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:53Z","lastTransitionTime":"2025-10-12T07:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.680440 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:53Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.691914 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:53Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.702269 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"636df6e3-ed6a-452a-9d5d-e26139f62951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f52b4e0ae3f90600ec5e799352a1f70b4fad54a4805616d1508c22af008a0f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d
4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aa5b66fc1ca787e84c8ca2a717038f9e522862fb680d5271921e410d5841f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://091875c6e2e9f643a0c31783ab1905c5fd448d20e98b9b4de70be991e7e8f1c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:53Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.713377 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:53Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.724827 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:53Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.738957 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:46Z\\\",\\\"message\\\":\\\"achine-config-daemon-5mz5c openshift-multus/multus-8hm26 openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns openshift-dns/node-resolver-f5988 openshift-multus/network-metrics-daemon-kwphq openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/network-operator-58b4c7f79c-55gtf 
openshift-kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-operator/iptables-alerter-4ln5h]\\\\nI1012 07:35:46.210430 6238 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1012 07:35:46.210433 6238 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9
215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:53Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.748236 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kwphq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3e76cc-139b-4a2a-b96b-6077e3706376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kwphq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:53Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.769124 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.769161 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.769171 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.769186 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.769197 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:53Z","lastTransitionTime":"2025-10-12T07:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.870842 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.870889 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.870898 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.870914 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.870925 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:53Z","lastTransitionTime":"2025-10-12T07:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.973327 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.973404 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.973418 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.973434 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:53 crc kubenswrapper[4599]: I1012 07:35:53.973447 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:53Z","lastTransitionTime":"2025-10-12T07:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.075851 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.075905 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.075914 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.075929 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.075939 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:54Z","lastTransitionTime":"2025-10-12T07:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.178588 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.178642 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.178651 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.178671 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.178684 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:54Z","lastTransitionTime":"2025-10-12T07:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.281439 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.281484 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.281493 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.281511 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.281524 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:54Z","lastTransitionTime":"2025-10-12T07:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.384148 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.384194 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.384206 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.384225 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.384235 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:54Z","lastTransitionTime":"2025-10-12T07:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.486668 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.486726 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.486736 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.486760 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.486792 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:54Z","lastTransitionTime":"2025-10-12T07:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.544525 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:54 crc kubenswrapper[4599]: E1012 07:35:54.544701 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.589656 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.589689 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.589699 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.589712 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.589723 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:54Z","lastTransitionTime":"2025-10-12T07:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.651576 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.651623 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.651634 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.651651 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.651665 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:54Z","lastTransitionTime":"2025-10-12T07:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:54 crc kubenswrapper[4599]: E1012 07:35:54.662436 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:54Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.666157 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.666217 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.666230 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.666251 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.666264 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:54Z","lastTransitionTime":"2025-10-12T07:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:54 crc kubenswrapper[4599]: E1012 07:35:54.675596 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:54Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.684696 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.684742 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.684751 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.684766 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.684776 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:54Z","lastTransitionTime":"2025-10-12T07:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:54 crc kubenswrapper[4599]: E1012 07:35:54.709980 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:54Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.715044 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.715077 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.715085 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.715100 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.715112 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:54Z","lastTransitionTime":"2025-10-12T07:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:54 crc kubenswrapper[4599]: E1012 07:35:54.728628 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:54Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.735172 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.735218 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.735227 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.735245 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.735255 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:54Z","lastTransitionTime":"2025-10-12T07:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:54 crc kubenswrapper[4599]: E1012 07:35:54.746656 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:35:54Z is after 2025-08-24T17:21:41Z" Oct 12 07:35:54 crc kubenswrapper[4599]: E1012 07:35:54.746770 4599 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.748074 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.748105 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.748114 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.748130 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.748141 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:54Z","lastTransitionTime":"2025-10-12T07:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.850863 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.850896 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.850907 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.850921 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.850931 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:54Z","lastTransitionTime":"2025-10-12T07:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.953727 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.953807 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.953822 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.953847 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:54 crc kubenswrapper[4599]: I1012 07:35:54.953860 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:54Z","lastTransitionTime":"2025-10-12T07:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.056779 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.056840 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.056849 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.056869 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.056882 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:55Z","lastTransitionTime":"2025-10-12T07:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.159668 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.159722 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.159735 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.159752 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.159766 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:55Z","lastTransitionTime":"2025-10-12T07:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.262375 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.262424 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.262436 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.262457 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.262467 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:55Z","lastTransitionTime":"2025-10-12T07:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.365095 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.365146 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.365157 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.365174 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.365185 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:55Z","lastTransitionTime":"2025-10-12T07:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.467501 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.467542 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.467551 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.467569 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.467581 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:55Z","lastTransitionTime":"2025-10-12T07:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.545252 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.545367 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 12 07:35:55 crc kubenswrapper[4599]: E1012 07:35:55.545434 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 12 07:35:55 crc kubenswrapper[4599]: E1012 07:35:55.545565 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.545822 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 12 07:35:55 crc kubenswrapper[4599]: E1012 07:35:55.546066 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.569546 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.569583 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.569624 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.569639 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.569650 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:55Z","lastTransitionTime":"2025-10-12T07:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.672115 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.672169 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.672182 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.672201 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.672212 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:55Z","lastTransitionTime":"2025-10-12T07:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.774539 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.774600 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.774613 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.774632 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.774645 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:55Z","lastTransitionTime":"2025-10-12T07:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.876985 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.877041 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.877052 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.877075 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.877089 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:55Z","lastTransitionTime":"2025-10-12T07:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.979709 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.979762 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.979773 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.979790 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:55 crc kubenswrapper[4599]: I1012 07:35:55.979804 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:55Z","lastTransitionTime":"2025-10-12T07:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.082284 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.082376 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.082386 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.082409 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.082421 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:56Z","lastTransitionTime":"2025-10-12T07:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.185174 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.185220 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.185230 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.185248 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.185259 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:56Z","lastTransitionTime":"2025-10-12T07:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.287716 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.287749 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.287760 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.287771 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.287779 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:56Z","lastTransitionTime":"2025-10-12T07:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.390192 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.390232 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.390242 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.390257 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.390267 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:56Z","lastTransitionTime":"2025-10-12T07:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.492019 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.492045 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.492053 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.492067 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.492074 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:56Z","lastTransitionTime":"2025-10-12T07:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.544946 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq"
Oct 12 07:35:56 crc kubenswrapper[4599]: E1012 07:35:56.545049 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.594577 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.594605 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.594613 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.594623 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.594633 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:56Z","lastTransitionTime":"2025-10-12T07:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.697042 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.697068 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.697076 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.697086 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.697094 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:56Z","lastTransitionTime":"2025-10-12T07:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.800023 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.800051 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.800059 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.800071 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.800079 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:56Z","lastTransitionTime":"2025-10-12T07:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.901874 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.901908 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.901917 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.901929 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:56 crc kubenswrapper[4599]: I1012 07:35:56.901936 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:56Z","lastTransitionTime":"2025-10-12T07:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.004842 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.004876 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.004887 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.004901 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.004913 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:57Z","lastTransitionTime":"2025-10-12T07:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.107463 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.107513 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.107530 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.107549 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.107563 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:57Z","lastTransitionTime":"2025-10-12T07:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.210141 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.210198 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.210210 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.210230 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.210241 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:57Z","lastTransitionTime":"2025-10-12T07:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.312211 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.312256 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.312265 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.312282 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.312291 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:57Z","lastTransitionTime":"2025-10-12T07:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.414768 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.414809 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.414821 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.415042 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.415051 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:57Z","lastTransitionTime":"2025-10-12T07:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.517296 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.517377 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.517389 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.517406 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.517416 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:57Z","lastTransitionTime":"2025-10-12T07:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.544997 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.545108 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 12 07:35:57 crc kubenswrapper[4599]: E1012 07:35:57.545214 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.545250 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 12 07:35:57 crc kubenswrapper[4599]: E1012 07:35:57.545350 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 12 07:35:57 crc kubenswrapper[4599]: E1012 07:35:57.545452 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.620260 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.620294 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.620307 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.620322 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.620414 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:57Z","lastTransitionTime":"2025-10-12T07:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.723209 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.723265 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.723279 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.723296 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.723305 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:57Z","lastTransitionTime":"2025-10-12T07:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.826413 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.826485 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.826497 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.826525 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.826540 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:57Z","lastTransitionTime":"2025-10-12T07:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.929905 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.929956 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.929967 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.929990 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:57 crc kubenswrapper[4599]: I1012 07:35:57.930003 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:57Z","lastTransitionTime":"2025-10-12T07:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.032452 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.032508 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.032518 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.032536 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.032547 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:58Z","lastTransitionTime":"2025-10-12T07:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.134803 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.134846 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.134868 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.134883 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.134906 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:58Z","lastTransitionTime":"2025-10-12T07:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.237668 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.237724 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.237734 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.237755 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.237769 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:58Z","lastTransitionTime":"2025-10-12T07:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.340908 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.340969 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.340980 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.340998 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.341009 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:58Z","lastTransitionTime":"2025-10-12T07:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.444216 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.444269 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.444281 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.444304 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.444318 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:58Z","lastTransitionTime":"2025-10-12T07:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.545218 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:35:58 crc kubenswrapper[4599]: E1012 07:35:58.545403 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.547093 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.547136 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.547147 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.547166 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.547175 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:58Z","lastTransitionTime":"2025-10-12T07:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.649536 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.649574 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.649584 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.649597 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.649606 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:58Z","lastTransitionTime":"2025-10-12T07:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.752439 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.752526 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.752539 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.752566 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.752581 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:58Z","lastTransitionTime":"2025-10-12T07:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.854312 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.854404 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.854417 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.854438 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.854453 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:58Z","lastTransitionTime":"2025-10-12T07:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.957012 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.957075 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.957092 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.957113 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:58 crc kubenswrapper[4599]: I1012 07:35:58.957124 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:58Z","lastTransitionTime":"2025-10-12T07:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.060053 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.060111 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.060124 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.060145 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.060157 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:59Z","lastTransitionTime":"2025-10-12T07:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.162548 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.162604 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.162616 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.162634 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.162650 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:59Z","lastTransitionTime":"2025-10-12T07:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.264837 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.264891 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.264901 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.264917 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.264931 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:59Z","lastTransitionTime":"2025-10-12T07:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.367374 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.367428 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.367440 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.367461 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.367474 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:59Z","lastTransitionTime":"2025-10-12T07:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.469631 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.469676 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.469687 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.469702 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.469715 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:59Z","lastTransitionTime":"2025-10-12T07:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.545314 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.545323 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:35:59 crc kubenswrapper[4599]: E1012 07:35:59.545485 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.545515 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:35:59 crc kubenswrapper[4599]: E1012 07:35:59.545686 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:35:59 crc kubenswrapper[4599]: E1012 07:35:59.545809 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.571249 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.571292 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.571302 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.571319 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.571350 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:59Z","lastTransitionTime":"2025-10-12T07:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.673822 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.673857 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.673867 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.673893 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.673904 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:59Z","lastTransitionTime":"2025-10-12T07:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.777142 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.777212 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.777227 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.777251 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.777264 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:59Z","lastTransitionTime":"2025-10-12T07:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.879608 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.879677 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.879688 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.879707 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.879718 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:59Z","lastTransitionTime":"2025-10-12T07:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.982776 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.982828 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.982838 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.982857 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:35:59 crc kubenswrapper[4599]: I1012 07:35:59.982868 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:35:59Z","lastTransitionTime":"2025-10-12T07:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.086123 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.086175 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.086186 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.086207 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.086219 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:00Z","lastTransitionTime":"2025-10-12T07:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.188481 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.188529 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.188540 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.188559 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.188569 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:00Z","lastTransitionTime":"2025-10-12T07:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.290660 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.290699 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.290709 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.290722 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.290732 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:00Z","lastTransitionTime":"2025-10-12T07:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.393035 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.393076 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.393086 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.393103 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.393114 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:00Z","lastTransitionTime":"2025-10-12T07:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.496060 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.496106 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.496117 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.496134 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.496147 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:00Z","lastTransitionTime":"2025-10-12T07:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.545004 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:00 crc kubenswrapper[4599]: E1012 07:36:00.545168 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.598368 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.598417 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.598426 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.598444 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.598455 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:00Z","lastTransitionTime":"2025-10-12T07:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.700359 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.700400 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.700409 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.700429 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.700444 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:00Z","lastTransitionTime":"2025-10-12T07:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.803026 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.803065 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.803077 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.803094 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.803104 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:00Z","lastTransitionTime":"2025-10-12T07:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.904886 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.904954 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.904964 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.904976 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:00 crc kubenswrapper[4599]: I1012 07:36:00.904984 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:00Z","lastTransitionTime":"2025-10-12T07:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.006937 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.006967 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.006977 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.006989 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.006997 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:01Z","lastTransitionTime":"2025-10-12T07:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.108702 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.108725 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.108732 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.108743 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.108751 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:01Z","lastTransitionTime":"2025-10-12T07:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.211146 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.211173 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.211183 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.211194 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.211201 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:01Z","lastTransitionTime":"2025-10-12T07:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.312617 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.312640 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.312649 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.312661 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.312861 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:01Z","lastTransitionTime":"2025-10-12T07:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.414583 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.414607 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.414615 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.414627 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.414635 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:01Z","lastTransitionTime":"2025-10-12T07:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.516819 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.516845 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.516853 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.516862 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.516869 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:01Z","lastTransitionTime":"2025-10-12T07:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.544886 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:01 crc kubenswrapper[4599]: E1012 07:36:01.544988 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.545118 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:01 crc kubenswrapper[4599]: E1012 07:36:01.545166 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.545124 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:01 crc kubenswrapper[4599]: E1012 07:36:01.545250 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.618709 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.618734 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.618744 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.618755 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.618762 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:01Z","lastTransitionTime":"2025-10-12T07:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.721481 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.721535 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.721545 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.721567 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.721580 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:01Z","lastTransitionTime":"2025-10-12T07:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.823489 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.823517 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.823529 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.823542 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.823551 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:01Z","lastTransitionTime":"2025-10-12T07:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.925496 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.925545 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.925555 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.925571 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:01 crc kubenswrapper[4599]: I1012 07:36:01.925579 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:01Z","lastTransitionTime":"2025-10-12T07:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.027704 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.027750 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.027760 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.027772 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.027782 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:02Z","lastTransitionTime":"2025-10-12T07:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.129393 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.129422 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.129430 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.129440 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.129450 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:02Z","lastTransitionTime":"2025-10-12T07:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.230986 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.231016 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.231025 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.231036 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.231043 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:02Z","lastTransitionTime":"2025-10-12T07:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.332862 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.332897 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.332908 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.332921 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.332939 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:02Z","lastTransitionTime":"2025-10-12T07:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.434691 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.434752 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.434783 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.434815 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.434824 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:02Z","lastTransitionTime":"2025-10-12T07:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.536317 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.536379 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.536484 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.536506 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.536517 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:02Z","lastTransitionTime":"2025-10-12T07:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.544820 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:02 crc kubenswrapper[4599]: E1012 07:36:02.545159 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.545313 4599 scope.go:117] "RemoveContainer" containerID="72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a" Oct 12 07:36:02 crc kubenswrapper[4599]: E1012 07:36:02.545470 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\"" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.638813 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.638857 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.638867 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.638885 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.638899 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:02Z","lastTransitionTime":"2025-10-12T07:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.741291 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.741358 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.741370 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.741388 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.741402 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:02Z","lastTransitionTime":"2025-10-12T07:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.843602 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.843652 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.843662 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.843680 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.843695 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:02Z","lastTransitionTime":"2025-10-12T07:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.946019 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.946075 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.946084 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.946103 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:02 crc kubenswrapper[4599]: I1012 07:36:02.946113 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:02Z","lastTransitionTime":"2025-10-12T07:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.048769 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.048845 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.048856 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.048883 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.048899 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:03Z","lastTransitionTime":"2025-10-12T07:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.150860 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.150902 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.150911 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.150926 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.150944 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:03Z","lastTransitionTime":"2025-10-12T07:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.253026 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.253076 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.253088 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.253113 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.253127 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:03Z","lastTransitionTime":"2025-10-12T07:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.355173 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.355224 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.355233 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.355252 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.355264 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:03Z","lastTransitionTime":"2025-10-12T07:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.456956 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.456989 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.456999 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.457014 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.457024 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:03Z","lastTransitionTime":"2025-10-12T07:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.545112 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:03 crc kubenswrapper[4599]: E1012 07:36:03.545266 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.545487 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.545487 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:03 crc kubenswrapper[4599]: E1012 07:36:03.545647 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:03 crc kubenswrapper[4599]: E1012 07:36:03.545679 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.558821 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.558866 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.558878 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.558894 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.558908 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:03Z","lastTransitionTime":"2025-10-12T07:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.558844 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe4
2e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:03Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.569519 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:03Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.579246 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:03Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.588575 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:03Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.596560 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:36:03Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.607984 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:03Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.618739 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:03Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.627282 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"636df6e3-ed6a-452a-9d5d-e26139f62951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f52b4e0ae3f90600ec5e799352a1f70b4fad54a4805616d1508c22af008a0f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aa5b66fc1ca787e84c8ca2a717038f9e522862fb680d5271921e410d5841f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://091875c6e2e9f643a0c31783ab1905c5fd448d20e98b9b4de70be991e7e8f1c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\
"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:03Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.634237 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:03Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.641361 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:03Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.656929 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:46Z\\\",\\\"message\\\":\\\"achine-config-daemon-5mz5c openshift-multus/multus-8hm26 openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns openshift-dns/node-resolver-f5988 openshift-multus/network-metrics-daemon-kwphq openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/network-operator-58b4c7f79c-55gtf 
openshift-kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-operator/iptables-alerter-4ln5h]\\\\nI1012 07:35:46.210430 6238 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1012 07:35:46.210433 6238 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9
215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:03Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.660469 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.660502 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.660512 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.660525 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.660534 4599 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:03Z","lastTransitionTime":"2025-10-12T07:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.663876 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kwphq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3e76cc-139b-4a2a-b96b-6077e3706376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kwphq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:03Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:03 crc 
kubenswrapper[4599]: I1012 07:36:03.672375 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e531
9ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:03Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.680543 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:03Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.688176 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:03Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.696287 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:03Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.703556 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2f2c3c17ada7047edb8e0fcd51227104458618c98e5ddec12123aa7ee173e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84de1dd84c3c72376a170b7ccbdfff8c3bdd
5e9e18a344002e112e81ca555a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:03Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.763044 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.763222 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.763494 4599 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.763693 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.763881 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:03Z","lastTransitionTime":"2025-10-12T07:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.866541 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.866619 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.866633 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.866655 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.866675 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:03Z","lastTransitionTime":"2025-10-12T07:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.970470 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.970721 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.970805 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.970880 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:03 crc kubenswrapper[4599]: I1012 07:36:03.970939 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:03Z","lastTransitionTime":"2025-10-12T07:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.073480 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.074047 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.074120 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.074191 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.074267 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:04Z","lastTransitionTime":"2025-10-12T07:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.176976 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.177032 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.177042 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.177063 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.177078 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:04Z","lastTransitionTime":"2025-10-12T07:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.278974 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.279159 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.279257 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.279353 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.279430 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:04Z","lastTransitionTime":"2025-10-12T07:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.382156 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.382203 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.382215 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.382231 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.382241 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:04Z","lastTransitionTime":"2025-10-12T07:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.484099 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.484159 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.484169 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.484196 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.484211 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:04Z","lastTransitionTime":"2025-10-12T07:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.545172 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:04 crc kubenswrapper[4599]: E1012 07:36:04.545350 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.587081 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.587138 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.587147 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.587170 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.587183 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:04Z","lastTransitionTime":"2025-10-12T07:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.689610 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.689676 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.689688 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.689712 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.689725 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:04Z","lastTransitionTime":"2025-10-12T07:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.791669 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.791717 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.791746 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.791764 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.791774 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:04Z","lastTransitionTime":"2025-10-12T07:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.893911 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.893964 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.893983 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.894002 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.894014 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:04Z","lastTransitionTime":"2025-10-12T07:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.997255 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.997305 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.997317 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.997360 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:04 crc kubenswrapper[4599]: I1012 07:36:04.997372 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:04Z","lastTransitionTime":"2025-10-12T07:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.054525 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.054597 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.054610 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.054633 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.054645 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:05Z","lastTransitionTime":"2025-10-12T07:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:05 crc kubenswrapper[4599]: E1012 07:36:05.067242 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:05Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.071438 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.071474 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.071484 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.071503 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.071515 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:05Z","lastTransitionTime":"2025-10-12T07:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:05 crc kubenswrapper[4599]: E1012 07:36:05.087840 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:05Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.091111 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.091160 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.091171 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.091187 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.091198 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:05Z","lastTransitionTime":"2025-10-12T07:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:05 crc kubenswrapper[4599]: E1012 07:36:05.102176 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:05Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.105019 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.105115 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.105193 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.105258 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.105312 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:05Z","lastTransitionTime":"2025-10-12T07:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:05 crc kubenswrapper[4599]: E1012 07:36:05.114763 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:05Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.117864 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.118002 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.118102 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.118180 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.118251 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:05Z","lastTransitionTime":"2025-10-12T07:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:05 crc kubenswrapper[4599]: E1012 07:36:05.127057 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:05Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:05 crc kubenswrapper[4599]: E1012 07:36:05.127305 4599 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.128400 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.128437 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.128447 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.128461 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.128471 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:05Z","lastTransitionTime":"2025-10-12T07:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.230559 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.230845 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.230944 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.231034 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.231100 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:05Z","lastTransitionTime":"2025-10-12T07:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.337112 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.337594 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.337659 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.337739 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.337807 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:05Z","lastTransitionTime":"2025-10-12T07:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.441925 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.441968 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.441991 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.442014 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.442025 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:05Z","lastTransitionTime":"2025-10-12T07:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.544305 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.544993 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.545035 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.545049 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.545064 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.545073 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:05Z","lastTransitionTime":"2025-10-12T07:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.544941 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.544978 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:05 crc kubenswrapper[4599]: E1012 07:36:05.545430 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:05 crc kubenswrapper[4599]: E1012 07:36:05.545531 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:05 crc kubenswrapper[4599]: E1012 07:36:05.545580 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.647972 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.648032 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.648043 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.648059 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.648073 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:05Z","lastTransitionTime":"2025-10-12T07:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.750655 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.750704 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.750714 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.750734 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.750746 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:05Z","lastTransitionTime":"2025-10-12T07:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.852161 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.852408 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.852499 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.852557 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.852616 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:05Z","lastTransitionTime":"2025-10-12T07:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.955237 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.955274 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.955283 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.955297 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:05 crc kubenswrapper[4599]: I1012 07:36:05.955310 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:05Z","lastTransitionTime":"2025-10-12T07:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.056975 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.057035 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.057055 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.057075 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.057089 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:06Z","lastTransitionTime":"2025-10-12T07:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.158904 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.158952 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.158961 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.158983 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.159004 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:06Z","lastTransitionTime":"2025-10-12T07:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.260936 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.260974 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.260984 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.261012 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.261039 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:06Z","lastTransitionTime":"2025-10-12T07:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.363122 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.363183 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.363193 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.363213 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.363225 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:06Z","lastTransitionTime":"2025-10-12T07:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.465250 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.465300 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.465318 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.465447 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.465474 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:06Z","lastTransitionTime":"2025-10-12T07:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.498561 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs\") pod \"network-metrics-daemon-kwphq\" (UID: \"3c3e76cc-139b-4a2a-b96b-6077e3706376\") " pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:06 crc kubenswrapper[4599]: E1012 07:36:06.498667 4599 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 07:36:06 crc kubenswrapper[4599]: E1012 07:36:06.498715 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs podName:3c3e76cc-139b-4a2a-b96b-6077e3706376 nodeName:}" failed. No retries permitted until 2025-10-12 07:36:38.498701756 +0000 UTC m=+95.287897259 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs") pod "network-metrics-daemon-kwphq" (UID: "3c3e76cc-139b-4a2a-b96b-6077e3706376") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.544532 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:06 crc kubenswrapper[4599]: E1012 07:36:06.544639 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.567256 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.567298 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.567307 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.567323 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.567356 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:06Z","lastTransitionTime":"2025-10-12T07:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.669450 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.669507 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.669517 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.669537 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.669546 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:06Z","lastTransitionTime":"2025-10-12T07:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.771996 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.772071 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.772085 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.772105 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.772121 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:06Z","lastTransitionTime":"2025-10-12T07:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.874271 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.874327 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.874354 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.874368 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.874378 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:06Z","lastTransitionTime":"2025-10-12T07:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.975916 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.975966 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.975977 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.975990 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:06 crc kubenswrapper[4599]: I1012 07:36:06.975999 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:06Z","lastTransitionTime":"2025-10-12T07:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.078796 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.079107 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.079183 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.079249 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.079317 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:07Z","lastTransitionTime":"2025-10-12T07:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.181958 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.182000 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.182041 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.182056 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.182065 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:07Z","lastTransitionTime":"2025-10-12T07:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.285092 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.285123 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.285132 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.285148 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.285158 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:07Z","lastTransitionTime":"2025-10-12T07:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.388003 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.388067 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.388077 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.388095 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.388109 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:07Z","lastTransitionTime":"2025-10-12T07:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.490697 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.490742 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.490752 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.490771 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.490782 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:07Z","lastTransitionTime":"2025-10-12T07:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.544541 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.544661 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:07 crc kubenswrapper[4599]: E1012 07:36:07.544697 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:07 crc kubenswrapper[4599]: E1012 07:36:07.544842 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.544988 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:07 crc kubenswrapper[4599]: E1012 07:36:07.545179 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.592959 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.593002 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.593029 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.593056 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.593069 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:07Z","lastTransitionTime":"2025-10-12T07:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.695994 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.696098 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.696113 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.696136 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.696148 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:07Z","lastTransitionTime":"2025-10-12T07:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.799271 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.799360 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.799375 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.799400 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.799413 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:07Z","lastTransitionTime":"2025-10-12T07:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.852851 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hm26_ce311f52-0501-45d3-8209-b1d2aa25028b/kube-multus/0.log" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.852933 4599 generic.go:334] "Generic (PLEG): container finished" podID="ce311f52-0501-45d3-8209-b1d2aa25028b" containerID="611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38" exitCode=1 Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.852975 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8hm26" event={"ID":"ce311f52-0501-45d3-8209-b1d2aa25028b","Type":"ContainerDied","Data":"611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38"} Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.853570 4599 scope.go:117] "RemoveContainer" containerID="611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.867051 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:07Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.879708 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:36:07Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.890922 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:07Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.901062 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.901106 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.901118 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.901133 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.901143 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:07Z","lastTransitionTime":"2025-10-12T07:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.901897 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:36:07Z\\\",\\\"message\\\":\\\"2025-10-12T07:35:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dbf0399a-84fd-451c-96bc-e20b8ff4f200\\\\n2025-10-12T07:35:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dbf0399a-84fd-451c-96bc-e20b8ff4f200 to /host/opt/cni/bin/\\\\n2025-10-12T07:35:22Z [verbose] multus-daemon started\\\\n2025-10-12T07:35:22Z [verbose] Readiness Indicator file check\\\\n2025-10-12T07:36:07Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:07Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.911960 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:07Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.922188 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:07Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.932179 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:07Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.939475 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kwphq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3e76cc-139b-4a2a-b96b-6077e3706376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kwphq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:07Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:07 crc 
kubenswrapper[4599]: I1012 07:36:07.947993 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"636df6e3-ed6a-452a-9d5d-e26139f62951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f52b4e0ae3f90600ec5e799352a1f70b4fad54a4805616d1508c22af008a0f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aa5b66fc1ca787e84c8ca2a717038f9e522862fb680d5271921e410d5841f4\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://091875c6e2e9f643a0c31783ab1905c5fd448d20e98b9b4de70be991e7e8f1c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:07Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.956509 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:07Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.963952 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:07Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.980476 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:46Z\\\",\\\"message\\\":\\\"achine-config-daemon-5mz5c openshift-multus/multus-8hm26 openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns openshift-dns/node-resolver-f5988 openshift-multus/network-metrics-daemon-kwphq openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/network-operator-58b4c7f79c-55gtf 
openshift-kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-operator/iptables-alerter-4ln5h]\\\\nI1012 07:35:46.210430 6238 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1012 07:35:46.210433 6238 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9
215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:07Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.988779 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2f2c3c17ada7047edb8e0fcd51227104458618c98e5ddec12123aa7ee173e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84de1dd84c3c72376a170b7ccbdfff8c3bdd
5e9e18a344002e112e81ca555a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:07Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:07 crc kubenswrapper[4599]: I1012 07:36:07.998514 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c2
3f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:07Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.003699 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.003727 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.003756 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.003772 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.003781 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:08Z","lastTransitionTime":"2025-10-12T07:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.007140 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:08Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.018592 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:08Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.027268 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:08Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.106638 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.106681 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.106691 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:08 crc 
kubenswrapper[4599]: I1012 07:36:08.106712 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.106722 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:08Z","lastTransitionTime":"2025-10-12T07:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.208818 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.208871 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.208882 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.208898 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.208909 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:08Z","lastTransitionTime":"2025-10-12T07:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.311464 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.311508 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.311518 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.311534 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.311544 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:08Z","lastTransitionTime":"2025-10-12T07:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.414203 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.414265 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.414278 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.414300 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.414313 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:08Z","lastTransitionTime":"2025-10-12T07:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.516473 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.516531 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.516542 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.516562 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.516576 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:08Z","lastTransitionTime":"2025-10-12T07:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.545156 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:08 crc kubenswrapper[4599]: E1012 07:36:08.545317 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.618586 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.618633 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.618645 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.618668 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.618682 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:08Z","lastTransitionTime":"2025-10-12T07:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.720736 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.720791 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.720803 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.720823 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.720836 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:08Z","lastTransitionTime":"2025-10-12T07:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.822858 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.822918 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.822929 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.822951 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.822969 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:08Z","lastTransitionTime":"2025-10-12T07:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.857501 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hm26_ce311f52-0501-45d3-8209-b1d2aa25028b/kube-multus/0.log" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.857589 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8hm26" event={"ID":"ce311f52-0501-45d3-8209-b1d2aa25028b","Type":"ContainerStarted","Data":"1d07bb1c67c880fb49b68be21180f0a5a053da62097b38605440625d10650033"} Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.870686 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"636df6e3-ed6a-452a-9d5d-e26139f62951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f52b4e0ae3f90600ec5e799352a1f70b4fad54a4805616d1508c22af008a0f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aa5b66fc1ca787e84c8ca2a717038f9e522862fb680d5271921e410d5841f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://091875c6e2e9f643a0c31783ab1905c5fd448d20e98b9b4de70be991e7e8f1c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:08Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.880374 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:08Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.891355 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:08Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.910245 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:46Z\\\",\\\"message\\\":\\\"achine-config-daemon-5mz5c openshift-multus/multus-8hm26 openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns openshift-dns/node-resolver-f5988 openshift-multus/network-metrics-daemon-kwphq openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/network-operator-58b4c7f79c-55gtf 
openshift-kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-operator/iptables-alerter-4ln5h]\\\\nI1012 07:35:46.210430 6238 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1012 07:35:46.210433 6238 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9
215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:08Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.920859 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kwphq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3e76cc-139b-4a2a-b96b-6077e3706376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kwphq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:08Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.925156 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.925192 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.925210 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.925232 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.925247 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:08Z","lastTransitionTime":"2025-10-12T07:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.934786 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:08Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.946727 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:08Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.958449 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:08Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.968370 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:08Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.976524 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2f2c3c17ada7047edb8e0fcd51227104458618c98e5ddec12123aa7ee173e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84de1dd84c3c72376a170b7ccbdfff8c3bdd
5e9e18a344002e112e81ca555a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:08Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.985188 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:08Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:08 crc kubenswrapper[4599]: I1012 07:36:08.995023 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d07bb1c67c880fb49b68be21180f0a5a053da62097b38605440625d10650033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:36:07Z\\\",\\\"message\\\":\\\"2025-10-12T07:35:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dbf0399a-84fd-451c-96bc-e20b8ff4f200\\\\n2025-10-12T07:35:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dbf0399a-84fd-451c-96bc-e20b8ff4f200 to /host/opt/cni/bin/\\\\n2025-10-12T07:35:22Z [verbose] multus-daemon started\\\\n2025-10-12T07:35:22Z [verbose] 
Readiness Indicator file check\\\\n2025-10-12T07:36:07Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:08Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.004174 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:09Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.012683 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:09Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.022096 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:09Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.027058 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.027092 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.027104 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.027118 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.027127 4599 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:09Z","lastTransitionTime":"2025-10-12T07:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.031051 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:09Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.041827 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:09Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.129403 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.129440 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.129451 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.129467 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.129478 4599 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:09Z","lastTransitionTime":"2025-10-12T07:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.231501 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.231558 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.231571 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.231589 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.231599 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:09Z","lastTransitionTime":"2025-10-12T07:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.333718 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.333759 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.333769 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.333783 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.333793 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:09Z","lastTransitionTime":"2025-10-12T07:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.436234 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.436273 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.436282 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.436298 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.436311 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:09Z","lastTransitionTime":"2025-10-12T07:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.537936 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.537979 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.537990 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.538005 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.538015 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:09Z","lastTransitionTime":"2025-10-12T07:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.544547 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.544583 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.544547 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:09 crc kubenswrapper[4599]: E1012 07:36:09.544645 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:09 crc kubenswrapper[4599]: E1012 07:36:09.544703 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:09 crc kubenswrapper[4599]: E1012 07:36:09.544784 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.643279 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.643350 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.643373 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.643393 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.643405 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:09Z","lastTransitionTime":"2025-10-12T07:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.746266 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.746323 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.746352 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.746369 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.746381 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:09Z","lastTransitionTime":"2025-10-12T07:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.848684 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.848723 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.848732 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.848750 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.848761 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:09Z","lastTransitionTime":"2025-10-12T07:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.951052 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.951103 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.951113 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.951129 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:09 crc kubenswrapper[4599]: I1012 07:36:09.951139 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:09Z","lastTransitionTime":"2025-10-12T07:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.053543 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.053577 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.053586 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.053610 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.053620 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:10Z","lastTransitionTime":"2025-10-12T07:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.156174 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.156209 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.156218 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.156229 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.156238 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:10Z","lastTransitionTime":"2025-10-12T07:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.258871 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.258900 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.258910 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.258923 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.258931 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:10Z","lastTransitionTime":"2025-10-12T07:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.361364 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.361383 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.361393 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.361404 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.361412 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:10Z","lastTransitionTime":"2025-10-12T07:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.463608 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.463641 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.463651 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.463664 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.463672 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:10Z","lastTransitionTime":"2025-10-12T07:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.544810 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:10 crc kubenswrapper[4599]: E1012 07:36:10.544916 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.565518 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.565576 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.565589 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.565609 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.565624 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:10Z","lastTransitionTime":"2025-10-12T07:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.668149 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.668177 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.668187 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.668197 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.668204 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:10Z","lastTransitionTime":"2025-10-12T07:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.770359 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.770386 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.770395 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.770405 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.770415 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:10Z","lastTransitionTime":"2025-10-12T07:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.872945 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.873031 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.873045 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.873066 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.873100 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:10Z","lastTransitionTime":"2025-10-12T07:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.975127 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.975173 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.975184 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.975203 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:10 crc kubenswrapper[4599]: I1012 07:36:10.975213 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:10Z","lastTransitionTime":"2025-10-12T07:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.077880 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.077929 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.077938 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.077955 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.077966 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:11Z","lastTransitionTime":"2025-10-12T07:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.180381 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.180414 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.180422 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.180438 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.180449 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:11Z","lastTransitionTime":"2025-10-12T07:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.282763 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.282802 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.282836 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.282854 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.282865 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:11Z","lastTransitionTime":"2025-10-12T07:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.385623 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.385707 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.385720 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.385744 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.385757 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:11Z","lastTransitionTime":"2025-10-12T07:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.488083 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.488149 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.488161 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.488185 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.488197 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:11Z","lastTransitionTime":"2025-10-12T07:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.544830 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.544835 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.544961 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:11 crc kubenswrapper[4599]: E1012 07:36:11.545021 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:11 crc kubenswrapper[4599]: E1012 07:36:11.545113 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:11 crc kubenswrapper[4599]: E1012 07:36:11.545285 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.590177 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.590221 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.590231 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.590248 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.590259 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:11Z","lastTransitionTime":"2025-10-12T07:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.692984 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.693044 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.693055 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.693072 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.693084 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:11Z","lastTransitionTime":"2025-10-12T07:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.795315 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.795366 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.795376 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.795391 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.795405 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:11Z","lastTransitionTime":"2025-10-12T07:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.897701 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.897742 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.897757 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.897777 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:11 crc kubenswrapper[4599]: I1012 07:36:11.897790 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:11Z","lastTransitionTime":"2025-10-12T07:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.000448 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.000500 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.000511 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.000529 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.000541 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:12Z","lastTransitionTime":"2025-10-12T07:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.102360 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.102398 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.102410 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.102454 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.102464 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:12Z","lastTransitionTime":"2025-10-12T07:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.204992 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.205018 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.205028 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.205039 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.205046 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:12Z","lastTransitionTime":"2025-10-12T07:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.307518 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.307580 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.307595 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.307610 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.307621 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:12Z","lastTransitionTime":"2025-10-12T07:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.409668 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.409698 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.409709 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.409721 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.409730 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:12Z","lastTransitionTime":"2025-10-12T07:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.513487 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.513517 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.513528 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.513540 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.513549 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:12Z","lastTransitionTime":"2025-10-12T07:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.544295 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:12 crc kubenswrapper[4599]: E1012 07:36:12.544444 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.616199 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.616233 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.616244 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.616258 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.616266 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:12Z","lastTransitionTime":"2025-10-12T07:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.718787 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.718830 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.718841 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.718860 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.718873 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:12Z","lastTransitionTime":"2025-10-12T07:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.820945 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.821224 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.821234 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.821250 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.821263 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:12Z","lastTransitionTime":"2025-10-12T07:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.924423 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.924462 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.924474 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.924489 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:12 crc kubenswrapper[4599]: I1012 07:36:12.924502 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:12Z","lastTransitionTime":"2025-10-12T07:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.026981 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.027011 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.027020 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.027032 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.027039 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:13Z","lastTransitionTime":"2025-10-12T07:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.129280 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.129313 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.129376 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.129391 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.129399 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:13Z","lastTransitionTime":"2025-10-12T07:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.231729 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.231789 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.231804 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.231825 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.231838 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:13Z","lastTransitionTime":"2025-10-12T07:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.333861 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.333901 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.333911 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.333933 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.333946 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:13Z","lastTransitionTime":"2025-10-12T07:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.436022 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.436077 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.436089 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.436102 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.436114 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:13Z","lastTransitionTime":"2025-10-12T07:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.539187 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.539246 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.539259 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.539283 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.539297 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:13Z","lastTransitionTime":"2025-10-12T07:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.544509 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:13 crc kubenswrapper[4599]: E1012 07:36:13.544674 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.544776 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.544877 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:13 crc kubenswrapper[4599]: E1012 07:36:13.544938 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:13 crc kubenswrapper[4599]: E1012 07:36:13.545111 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.556510 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:13Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.566479 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:13Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.576167 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:13Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.584652 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:36:13Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.594439 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:13Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.602821 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d07bb1c67c880fb49b68be21180f0a5a053da62097b38605440625d10650033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:36:07Z\\\",\\\"message\\\":\\\"2025-10-12T07:35:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dbf0399a-84fd-451c-96bc-e20b8ff4f200\\\\n2025-10-12T07:35:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dbf0399a-84fd-451c-96bc-e20b8ff4f200 to /host/opt/cni/bin/\\\\n2025-10-12T07:35:22Z [verbose] multus-daemon started\\\\n2025-10-12T07:35:22Z [verbose] Readiness Indicator file check\\\\n2025-10-12T07:36:07Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:13Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.610592 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"636df6e3-ed6a-452a-9d5d-e26139f62951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f52b4e0ae3f90600ec5e799352a1f70b4fad54a4805616d1508c22af008a0f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd
791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aa5b66fc1ca787e84c8ca2a717038f9e522862fb680d5271921e410d5841f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://091875c6e2e9f643a0c31783ab1905c5fd448d20e98b9b4de70be991e7e8f1c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:13Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.619948 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:13Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.626892 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:13Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.638937 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:46Z\\\",\\\"message\\\":\\\"achine-config-daemon-5mz5c openshift-multus/multus-8hm26 openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns openshift-dns/node-resolver-f5988 openshift-multus/network-metrics-daemon-kwphq openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/network-operator-58b4c7f79c-55gtf 
openshift-kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-operator/iptables-alerter-4ln5h]\\\\nI1012 07:35:46.210430 6238 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1012 07:35:46.210433 6238 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9
215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:13Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.641118 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.641173 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.641183 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.641200 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.641210 4599 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:13Z","lastTransitionTime":"2025-10-12T07:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.650173 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kwphq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3e76cc-139b-4a2a-b96b-6077e3706376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kwphq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:13Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:13 crc 
kubenswrapper[4599]: I1012 07:36:13.660726 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e531
9ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:13Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.670491 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:13Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.679510 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:13Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.687701 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:13Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.696069 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2f2c3c17ada7047edb8e0fcd51227104458618c98e5ddec12123aa7ee173e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84de1dd84c3c72376a170b7ccbdfff8c3bdd
5e9e18a344002e112e81ca555a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:13Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.704869 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:13Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.744598 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.744646 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.744657 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.744676 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.744686 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:13Z","lastTransitionTime":"2025-10-12T07:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.846926 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.846979 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.846991 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.847009 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.847022 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:13Z","lastTransitionTime":"2025-10-12T07:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.949790 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.949840 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.949869 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.949890 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:13 crc kubenswrapper[4599]: I1012 07:36:13.949900 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:13Z","lastTransitionTime":"2025-10-12T07:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.052182 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.052227 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.052241 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.052260 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.052276 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:14Z","lastTransitionTime":"2025-10-12T07:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.154686 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.154734 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.154744 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.154770 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.154782 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:14Z","lastTransitionTime":"2025-10-12T07:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.257063 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.257144 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.257159 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.257180 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.257194 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:14Z","lastTransitionTime":"2025-10-12T07:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.359723 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.359772 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.359783 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.359801 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.359812 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:14Z","lastTransitionTime":"2025-10-12T07:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.462589 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.462644 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.462653 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.462673 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.462685 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:14Z","lastTransitionTime":"2025-10-12T07:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.544632 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:14 crc kubenswrapper[4599]: E1012 07:36:14.544790 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.564544 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.564605 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.564619 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.564637 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.564648 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:14Z","lastTransitionTime":"2025-10-12T07:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.667109 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.667159 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.667169 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.667191 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.667203 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:14Z","lastTransitionTime":"2025-10-12T07:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.769849 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.769896 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.769906 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.769927 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.769939 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:14Z","lastTransitionTime":"2025-10-12T07:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.872448 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.872495 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.872508 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.872529 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.872545 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:14Z","lastTransitionTime":"2025-10-12T07:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.975357 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.975391 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.975400 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.975415 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:14 crc kubenswrapper[4599]: I1012 07:36:14.975424 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:14Z","lastTransitionTime":"2025-10-12T07:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.078035 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.078086 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.078096 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.078117 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.078128 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:15Z","lastTransitionTime":"2025-10-12T07:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.181176 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.181213 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.181222 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.181237 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.181248 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:15Z","lastTransitionTime":"2025-10-12T07:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.283516 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.283576 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.283588 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.283605 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.283615 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:15Z","lastTransitionTime":"2025-10-12T07:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.386707 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.386744 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.386755 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.386772 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.386786 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:15Z","lastTransitionTime":"2025-10-12T07:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.450317 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.450388 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.450399 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.450419 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.450430 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:15Z","lastTransitionTime":"2025-10-12T07:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:15 crc kubenswrapper[4599]: E1012 07:36:15.462097 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:15Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.466943 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.467015 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.467029 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.467054 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.467066 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:15Z","lastTransitionTime":"2025-10-12T07:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
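The patch failure recorded above comes down to an expired serving certificate for the "node.network-node-identity.openshift.io" webhook: the TLS handshake to https://127.0.0.1:9743 fails because the reported current time (2025-10-12T07:36:15Z) is past the certificate's notAfter bound (2025-08-24T17:21:41Z). A minimal sketch that replays just that validity comparison, using only the two timestamps taken from the error message (variable names are illustrative):

```python
from datetime import datetime, timezone

# Timestamps copied from the kubelet x509 error message above.
current = datetime(2025, 10, 12, 7, 36, 15, tzinfo=timezone.utc)
not_after = datetime(2025, 8, 24, 17, 21, 41, tzinfo=timezone.utc)

# The TLS stack rejects the chain when the current time falls
# outside the certificate's [notBefore, notAfter] window.
expired = current > not_after
print(expired)  # True: the webhook certificate lapsed weeks earlier
```

On a live node one would typically confirm this by inspecting the certificate itself (for example with `openssl s_client` against 127.0.0.1:9743) rather than the log text; the snippet only mirrors the comparison the TLS stack performed, and explains why every subsequent node-status patch in this log retries and fails the same way.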
Has your network provider started?"} Oct 12 07:36:15 crc kubenswrapper[4599]: E1012 07:36:15.478150 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:15Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.481666 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.481717 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.481728 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.481755 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.481770 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:15Z","lastTransitionTime":"2025-10-12T07:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:15 crc kubenswrapper[4599]: E1012 07:36:15.494243 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:15Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.498087 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.498128 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.498147 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.498165 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.498177 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:15Z","lastTransitionTime":"2025-10-12T07:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:15 crc kubenswrapper[4599]: E1012 07:36:15.509126 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:15Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.512787 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.512827 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.512840 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.512854 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.512862 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:15Z","lastTransitionTime":"2025-10-12T07:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:15 crc kubenswrapper[4599]: E1012 07:36:15.523002 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:15Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:15 crc kubenswrapper[4599]: E1012 07:36:15.523144 4599 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.524418 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.524453 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.524463 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.524500 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.524512 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:15Z","lastTransitionTime":"2025-10-12T07:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.545173 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:15 crc kubenswrapper[4599]: E1012 07:36:15.545368 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.545408 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.545391 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:15 crc kubenswrapper[4599]: E1012 07:36:15.545920 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.546181 4599 scope.go:117] "RemoveContainer" containerID="72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a" Oct 12 07:36:15 crc kubenswrapper[4599]: E1012 07:36:15.546172 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.626388 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.626430 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.626446 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.626467 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.626481 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:15Z","lastTransitionTime":"2025-10-12T07:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.729302 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.729328 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.729353 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.729369 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.729379 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:15Z","lastTransitionTime":"2025-10-12T07:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.831545 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.832256 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.832330 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.832439 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.832501 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:15Z","lastTransitionTime":"2025-10-12T07:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.880597 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whk5b_1a95b7ab-8632-4332-a30f-64f28ef8d313/ovnkube-controller/2.log" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.883231 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerStarted","Data":"22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1"} Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.883696 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.895039 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:15Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.908423 4599 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"init
ContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5ae
c20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:3
5:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:15Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.918812 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d07bb1c67c880fb49b68be21180f0a5a053da62097b38605440625d10650033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\
\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:36:07Z\\\",\\\"message\\\":\\\"2025-10-12T07:35:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dbf0399a-84fd-451c-96bc-e20b8ff4f200\\\\n2025-10-12T07:35:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dbf0399a-84fd-451c-96bc-e20b8ff4f200 to /host/opt/cni/bin/\\\\n2025-10-12T07:35:22Z [verbose] multus-daemon started\\\\n2025-10-12T07:35:22Z [verbose] Readiness Indicator file check\\\\n2025-10-12T07:36:07Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\"
:\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:15Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.930486 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:15Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.934046 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.934076 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.934086 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.934102 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.934113 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:15Z","lastTransitionTime":"2025-10-12T07:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.940022 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:15Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.948526 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:15Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.958796 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:36:15Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.967354 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"636df6e3-ed6a-452a-9d5d-e26139f62951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f52b4e0ae3f90600ec5e799352a1f70b4fad54a4805616d1508c22af008a0f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://56aa5b66fc1ca787e84c8ca2a717038f9e522862fb680d5271921e410d5841f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://091875c6e2e9f643a0c31783ab1905c5fd448d20e98b9b4de70be991e7e8f1c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:15Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.975661 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:15Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.982969 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:15Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:15 crc kubenswrapper[4599]: I1012 07:36:15.999382 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:46Z\\\",\\\"message\\\":\\\"achine-config-daemon-5mz5c openshift-multus/multus-8hm26 openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns openshift-dns/node-resolver-f5988 openshift-multus/network-metrics-daemon-kwphq openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/network-operator-58b4c7f79c-55gtf 
openshift-kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-operator/iptables-alerter-4ln5h]\\\\nI1012 07:35:46.210430 6238 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1012 07:35:46.210433 6238 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired 
o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:36:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:15Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.009312 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kwphq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3e76cc-139b-4a2a-b96b-6077e3706376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kwphq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:16Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:16 crc 
kubenswrapper[4599]: I1012 07:36:16.020610 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e531
9ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:16Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.034135 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:16Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.036924 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.036998 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.037011 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.037032 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.037044 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:16Z","lastTransitionTime":"2025-10-12T07:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.046217 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:16Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.057809 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:16Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.067961 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2f2c3c17ada7047edb8e0fcd51227104458618c98e5ddec12123aa7ee173e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84de1dd84c3c72376a170b7ccbdfff8c3bdd
5e9e18a344002e112e81ca555a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:16Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.140238 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.140278 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.140290 4599 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.140310 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.140323 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:16Z","lastTransitionTime":"2025-10-12T07:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.242322 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.242380 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.242391 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.242413 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.242424 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:16Z","lastTransitionTime":"2025-10-12T07:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.344998 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.345043 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.345052 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.345068 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.345079 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:16Z","lastTransitionTime":"2025-10-12T07:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.447459 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.447512 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.447523 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.447546 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.447559 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:16Z","lastTransitionTime":"2025-10-12T07:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.544951 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:16 crc kubenswrapper[4599]: E1012 07:36:16.545112 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.549755 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.549806 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.549818 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.549838 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.549851 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:16Z","lastTransitionTime":"2025-10-12T07:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.652614 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.652665 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.652677 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.652697 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.652708 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:16Z","lastTransitionTime":"2025-10-12T07:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.755325 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.755406 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.755417 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.755437 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.755454 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:16Z","lastTransitionTime":"2025-10-12T07:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.858203 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.858252 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.858262 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.858286 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.858303 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:16Z","lastTransitionTime":"2025-10-12T07:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.888118 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whk5b_1a95b7ab-8632-4332-a30f-64f28ef8d313/ovnkube-controller/3.log" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.888694 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whk5b_1a95b7ab-8632-4332-a30f-64f28ef8d313/ovnkube-controller/2.log" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.891802 4599 generic.go:334] "Generic (PLEG): container finished" podID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerID="22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1" exitCode=1 Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.891852 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerDied","Data":"22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1"} Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.891926 4599 scope.go:117] "RemoveContainer" containerID="72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.893890 4599 scope.go:117] "RemoveContainer" containerID="22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1" Oct 12 07:36:16 crc kubenswrapper[4599]: E1012 07:36:16.894246 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\"" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.905620 4599 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:16Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.915427 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:36:16Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.926629 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:16Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.937152 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d07bb1c67c880fb49b68be21180f0a5a053da62097b38605440625d10650033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:36:07Z\\\",\\\"message\\\":\\\"2025-10-12T07:35:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dbf0399a-84fd-451c-96bc-e20b8ff4f200\\\\n2025-10-12T07:35:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dbf0399a-84fd-451c-96bc-e20b8ff4f200 to /host/opt/cni/bin/\\\\n2025-10-12T07:35:22Z [verbose] multus-daemon started\\\\n2025-10-12T07:35:22Z [verbose] Readiness Indicator file check\\\\n2025-10-12T07:36:07Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:16Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.946129 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:16Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.955367 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:16Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.960846 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.960902 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.960926 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:16 crc 
kubenswrapper[4599]: I1012 07:36:16.960945 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.960957 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:16Z","lastTransitionTime":"2025-10-12T07:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:16 crc kubenswrapper[4599]: I1012 07:36:16.972810 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72e00dcebdb38459b2ffe60407ea9c8a343a2a57435d0596134981223e989f8a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:35:46Z\\\",\\\"message\\\":\\\"achine-config-daemon-5mz5c openshift-multus/multus-8hm26 openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns openshift-dns/node-resolver-f5988 openshift-multus/network-metrics-daemon-kwphq openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/network-operator-58b4c7f79c-55gtf 
openshift-kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-operator/iptables-alerter-4ln5h]\\\\nI1012 07:35:46.210430 6238 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1012 07:35:46.210433 6238 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired o\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:36:16Z\\\",\\\"message\\\":\\\"TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1012 07:36:16.300660 6611 services_controller.go:454] Service openshift-marketplace/redhat-operators for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1012 07:36:16.300096 6611 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-rxr42 in node crc\\\\nI1012 07:36:16.300675 6611 
obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-rxr42 after 0 failed attempt(s)\\\\nI1012 07:36:16.300681 6611 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-rxr42\\\\nI1012 07:36:16.300674 6611 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:36:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cn
i-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",
\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:16Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:16 crc 
kubenswrapper[4599]: I1012 07:36:16.984737 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kwphq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3e76cc-139b-4a2a-b96b-6077e3706376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kwphq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:16Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:16 crc 
kubenswrapper[4599]: I1012 07:36:16.994565 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"636df6e3-ed6a-452a-9d5d-e26139f62951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f52b4e0ae3f90600ec5e799352a1f70b4fad54a4805616d1508c22af008a0f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aa5b66fc1ca787e84c8ca2a717038f9e522862fb680d5271921e410d5841f4\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://091875c6e2e9f643a0c31783ab1905c5fd448d20e98b9b4de70be991e7e8f1c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:16Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.004982 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:17Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.015564 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:17Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.025414 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:17Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.036989 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2f2c3c17ada7047edb8e0fcd51227104458618c98e5ddec12123aa7ee173e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84de1dd84c3c72376a170b7ccbdfff8c3bdd
5e9e18a344002e112e81ca555a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:17Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.048424 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c2
3f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:17Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.058091 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:17Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.063647 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.063682 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.063692 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.063710 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.063721 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:17Z","lastTransitionTime":"2025-10-12T07:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.067368 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:17Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.076285 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:17Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.165864 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.165948 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.165960 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.165983 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.165995 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:17Z","lastTransitionTime":"2025-10-12T07:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.268588 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.268639 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.268648 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.268662 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.268674 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:17Z","lastTransitionTime":"2025-10-12T07:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.371400 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.371470 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.371480 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.371498 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.371512 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:17Z","lastTransitionTime":"2025-10-12T07:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.474848 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.474924 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.474940 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.474962 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.474976 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:17Z","lastTransitionTime":"2025-10-12T07:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.544671 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.544751 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:17 crc kubenswrapper[4599]: E1012 07:36:17.544874 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.544938 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:17 crc kubenswrapper[4599]: E1012 07:36:17.545112 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:17 crc kubenswrapper[4599]: E1012 07:36:17.545315 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.577975 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.578018 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.578029 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.578047 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.578061 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:17Z","lastTransitionTime":"2025-10-12T07:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.680716 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.680758 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.680768 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.680782 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.680794 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:17Z","lastTransitionTime":"2025-10-12T07:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.783765 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.783828 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.783840 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.783862 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.783873 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:17Z","lastTransitionTime":"2025-10-12T07:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.886004 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.886063 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.886078 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.886098 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.886110 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:17Z","lastTransitionTime":"2025-10-12T07:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.896475 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whk5b_1a95b7ab-8632-4332-a30f-64f28ef8d313/ovnkube-controller/3.log" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.900187 4599 scope.go:117] "RemoveContainer" containerID="22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1" Oct 12 07:36:17 crc kubenswrapper[4599]: E1012 07:36:17.900519 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\"" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.913289 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2f2c3c17ada7047edb8e0fcd51227104458618c98e5ddec12123aa7ee173e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84de1dd84c3c72376a170b7ccbdfff8c3bdd
5e9e18a344002e112e81ca555a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:17Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.925310 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c2
3f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:17Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.935915 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:17Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.944845 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:17Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.953436 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:17Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.962727 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:17Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.970381 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:36:17Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.980438 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:17Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.987950 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.988004 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.988014 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.988031 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.988059 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:17Z","lastTransitionTime":"2025-10-12T07:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:17 crc kubenswrapper[4599]: I1012 07:36:17.990791 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d07bb1c67c880fb49b68be21180f0a5a053da62097b38605440625d10650033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:36:07Z\\\",\\\"message\\\":\\\"2025-10-12T07:35:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dbf0399a-84fd-451c-96bc-e20b8ff4f200\\\\n2025-10-12T07:35:22+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dbf0399a-84fd-451c-96bc-e20b8ff4f200 to /host/opt/cni/bin/\\\\n2025-10-12T07:35:22Z [verbose] multus-daemon started\\\\n2025-10-12T07:35:22Z [verbose] Readiness Indicator file check\\\\n2025-10-12T07:36:07Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:17Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.000571 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:17Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.009656 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:18Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.020458 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:18Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.028102 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kwphq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3e76cc-139b-4a2a-b96b-6077e3706376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kwphq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:18Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:18 crc 
kubenswrapper[4599]: I1012 07:36:18.039547 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"636df6e3-ed6a-452a-9d5d-e26139f62951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f52b4e0ae3f90600ec5e799352a1f70b4fad54a4805616d1508c22af008a0f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aa5b66fc1ca787e84c8ca2a717038f9e522862fb680d5271921e410d5841f4\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://091875c6e2e9f643a0c31783ab1905c5fd448d20e98b9b4de70be991e7e8f1c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:18Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.048264 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:18Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.059312 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:18Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.076490 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:36:16Z\\\",\\\"message\\\":\\\"TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1012 07:36:16.300660 6611 services_controller.go:454] Service openshift-marketplace/redhat-operators for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load 
balancers\\\\nI1012 07:36:16.300096 6611 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-rxr42 in node crc\\\\nI1012 07:36:16.300675 6611 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-rxr42 after 0 failed attempt(s)\\\\nI1012 07:36:16.300681 6611 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-rxr42\\\\nI1012 07:36:16.300674 6611 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:36:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9
215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:18Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.090488 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.090524 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.090535 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.090552 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.090562 4599 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:18Z","lastTransitionTime":"2025-10-12T07:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.192795 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.193032 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.193099 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.193176 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.193246 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:18Z","lastTransitionTime":"2025-10-12T07:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.295544 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.295593 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.295607 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.295627 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.295639 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:18Z","lastTransitionTime":"2025-10-12T07:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.398799 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.398882 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.398894 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.398915 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.398929 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:18Z","lastTransitionTime":"2025-10-12T07:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.501582 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.501639 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.501648 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.501672 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.501683 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:18Z","lastTransitionTime":"2025-10-12T07:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.545243 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:18 crc kubenswrapper[4599]: E1012 07:36:18.545474 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.604403 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.604453 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.604465 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.604483 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.604495 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:18Z","lastTransitionTime":"2025-10-12T07:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.707186 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.707482 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.707566 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.707656 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.707782 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:18Z","lastTransitionTime":"2025-10-12T07:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.811550 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.811710 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.811777 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.811918 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.811996 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:18Z","lastTransitionTime":"2025-10-12T07:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.914772 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.915235 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.915366 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.915485 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:18 crc kubenswrapper[4599]: I1012 07:36:18.915578 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:18Z","lastTransitionTime":"2025-10-12T07:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.019809 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.019975 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.020039 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.020107 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.020161 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:19Z","lastTransitionTime":"2025-10-12T07:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.122087 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.122271 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.122355 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.122437 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.122497 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:19Z","lastTransitionTime":"2025-10-12T07:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.224358 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.224670 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.224730 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.224785 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.224846 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:19Z","lastTransitionTime":"2025-10-12T07:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.327248 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.327375 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.327471 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.327533 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.327590 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:19Z","lastTransitionTime":"2025-10-12T07:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.429566 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.429612 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.429621 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.429637 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.429647 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:19Z","lastTransitionTime":"2025-10-12T07:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.532612 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.532678 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.532692 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.532714 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.532728 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:19Z","lastTransitionTime":"2025-10-12T07:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.545066 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.545164 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:19 crc kubenswrapper[4599]: E1012 07:36:19.545225 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.545272 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:19 crc kubenswrapper[4599]: E1012 07:36:19.545397 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:19 crc kubenswrapper[4599]: E1012 07:36:19.545453 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.635288 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.635324 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.635352 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.635370 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.635385 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:19Z","lastTransitionTime":"2025-10-12T07:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.737160 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.737307 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.737398 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.737477 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.737534 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:19Z","lastTransitionTime":"2025-10-12T07:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.839742 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.839783 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.839792 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.839821 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.839832 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:19Z","lastTransitionTime":"2025-10-12T07:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.941698 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.942143 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.942214 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.942296 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:19 crc kubenswrapper[4599]: I1012 07:36:19.942406 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:19Z","lastTransitionTime":"2025-10-12T07:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.044551 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.044592 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.044605 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.044621 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.044633 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:20Z","lastTransitionTime":"2025-10-12T07:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.146848 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.146892 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.146904 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.146921 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.146931 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:20Z","lastTransitionTime":"2025-10-12T07:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.249279 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.249377 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.249385 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.249402 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.249412 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:20Z","lastTransitionTime":"2025-10-12T07:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.351832 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.351896 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.351906 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.351928 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.351938 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:20Z","lastTransitionTime":"2025-10-12T07:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.454547 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.454618 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.454629 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.454651 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.454662 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:20Z","lastTransitionTime":"2025-10-12T07:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.544642 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:20 crc kubenswrapper[4599]: E1012 07:36:20.544833 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.556846 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.556889 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.556899 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.556916 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.556926 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:20Z","lastTransitionTime":"2025-10-12T07:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.659827 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.659895 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.659905 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.659926 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.659936 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:20Z","lastTransitionTime":"2025-10-12T07:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.762702 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.762754 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.762869 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.762895 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.762908 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:20Z","lastTransitionTime":"2025-10-12T07:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.865286 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.865357 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.865376 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.865397 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.865411 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:20Z","lastTransitionTime":"2025-10-12T07:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.967198 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.967246 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.967258 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.967277 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:20 crc kubenswrapper[4599]: I1012 07:36:20.967291 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:20Z","lastTransitionTime":"2025-10-12T07:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.069601 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.069638 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.069649 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.069666 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.069677 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:21Z","lastTransitionTime":"2025-10-12T07:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.172036 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.172088 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.172105 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.172126 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.172138 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:21Z","lastTransitionTime":"2025-10-12T07:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.274840 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.274885 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.274900 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.274919 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.274928 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:21Z","lastTransitionTime":"2025-10-12T07:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.377641 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.377700 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.377711 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.377731 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.377754 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:21Z","lastTransitionTime":"2025-10-12T07:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.480432 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.480479 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.480488 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.480505 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.480516 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:21Z","lastTransitionTime":"2025-10-12T07:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.545284 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.545326 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.545609 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:21 crc kubenswrapper[4599]: E1012 07:36:21.545721 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:21 crc kubenswrapper[4599]: E1012 07:36:21.545946 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:21 crc kubenswrapper[4599]: E1012 07:36:21.546018 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.557780 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.583110 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.583156 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.583167 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.583185 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.583200 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:21Z","lastTransitionTime":"2025-10-12T07:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.685637 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.685683 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.685695 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.685716 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.685728 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:21Z","lastTransitionTime":"2025-10-12T07:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.788304 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.788369 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.788380 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.788399 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.788413 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:21Z","lastTransitionTime":"2025-10-12T07:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.890823 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.890872 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.890881 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.890899 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.890910 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:21Z","lastTransitionTime":"2025-10-12T07:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.993743 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.993787 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.993796 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.993815 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:21 crc kubenswrapper[4599]: I1012 07:36:21.993827 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:21Z","lastTransitionTime":"2025-10-12T07:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.096185 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.096222 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.096230 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.096243 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.096252 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:22Z","lastTransitionTime":"2025-10-12T07:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.198557 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.198619 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.198630 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.198652 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.198676 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:22Z","lastTransitionTime":"2025-10-12T07:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.300872 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.300917 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.300926 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.300943 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.300954 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:22Z","lastTransitionTime":"2025-10-12T07:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.403069 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.403131 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.403142 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.403162 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.403174 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:22Z","lastTransitionTime":"2025-10-12T07:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.505858 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.505935 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.505944 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.505959 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.505968 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:22Z","lastTransitionTime":"2025-10-12T07:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.544542 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:22 crc kubenswrapper[4599]: E1012 07:36:22.544749 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.608523 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.608568 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.608577 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.608594 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.608606 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:22Z","lastTransitionTime":"2025-10-12T07:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.710374 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.710412 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.710421 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.710434 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.710443 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:22Z","lastTransitionTime":"2025-10-12T07:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.811978 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.812007 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.812016 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.812031 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.812039 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:22Z","lastTransitionTime":"2025-10-12T07:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.913649 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.913697 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.913706 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.913719 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:22 crc kubenswrapper[4599]: I1012 07:36:22.913728 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:22Z","lastTransitionTime":"2025-10-12T07:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.015571 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.015616 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.015626 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.015641 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.015652 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:23Z","lastTransitionTime":"2025-10-12T07:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.118394 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.118617 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.118625 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.118642 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.118653 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:23Z","lastTransitionTime":"2025-10-12T07:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.221134 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.221184 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.221196 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.221214 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.221226 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:23Z","lastTransitionTime":"2025-10-12T07:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.323235 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.323286 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.323298 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.323318 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.323328 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:23Z","lastTransitionTime":"2025-10-12T07:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.425420 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.425455 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.425480 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.425494 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.425504 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:23Z","lastTransitionTime":"2025-10-12T07:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.527845 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.527899 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.527908 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.527921 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.527955 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:23Z","lastTransitionTime":"2025-10-12T07:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.545316 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:23 crc kubenswrapper[4599]: E1012 07:36:23.545416 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.545574 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:23 crc kubenswrapper[4599]: E1012 07:36:23.545652 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.545689 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:23 crc kubenswrapper[4599]: E1012 07:36:23.545852 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.557100 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad3aaa-c433-4fe0-b01e-f34979c35817\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443dfe629ce4bc2f6d96108ef419b3f8e577609981571c42882a8b3eb587722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\
\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b748451e24963062c3aec5bf944775ca77c166a8b97b4309b9102a9157312cdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b748451e24963062c3aec5bf944775ca77c166a8b97b4309b9102a9157312cdd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.566436 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.574883 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:36:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.584309 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.592929 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d07bb1c67c880fb49b68be21180f0a5a053da62097b38605440625d10650033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:36:07Z\\\",\\\"message\\\":\\\"2025-10-12T07:35:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dbf0399a-84fd-451c-96bc-e20b8ff4f200\\\\n2025-10-12T07:35:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dbf0399a-84fd-451c-96bc-e20b8ff4f200 to /host/opt/cni/bin/\\\\n2025-10-12T07:35:22Z [verbose] multus-daemon started\\\\n2025-10-12T07:35:22Z [verbose] Readiness Indicator file check\\\\n2025-10-12T07:36:07Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.601180 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.609753 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.619104 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.625587 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kwphq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3e76cc-139b-4a2a-b96b-6077e3706376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kwphq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:23 crc 
kubenswrapper[4599]: I1012 07:36:23.630011 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.630058 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.630070 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.630084 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.630092 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:23Z","lastTransitionTime":"2025-10-12T07:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.633306 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"636df6e3-ed6a-452a-9d5d-e26139f62951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f52b4e0ae3f90600ec5e799352a1f70b4fad54a4805616d1508c22af008a0f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aa5b66fc1ca787e84c8ca2a71703
8f9e522862fb680d5271921e410d5841f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://091875c6e2e9f643a0c31783ab1905c5fd448d20e98b9b4de70be991e7e8f1c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.640967 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.649010 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.665039 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:36:16Z\\\",\\\"message\\\":\\\"TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1012 07:36:16.300660 6611 services_controller.go:454] Service openshift-marketplace/redhat-operators for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load 
balancers\\\\nI1012 07:36:16.300096 6611 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-rxr42 in node crc\\\\nI1012 07:36:16.300675 6611 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-rxr42 after 0 failed attempt(s)\\\\nI1012 07:36:16.300681 6611 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-rxr42\\\\nI1012 07:36:16.300674 6611 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:36:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9
215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.673596 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2f2c3c17ada7047edb8e0fcd51227104458618c98e5ddec12123aa7ee173e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84de1dd84c3c72376a170b7ccbdfff8c3bdd
5e9e18a344002e112e81ca555a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.684636 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c2
3f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.693328 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.701585 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.710832 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:23Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.732142 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.732176 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.732185 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:23 crc 
kubenswrapper[4599]: I1012 07:36:23.732200 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.732211 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:23Z","lastTransitionTime":"2025-10-12T07:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.834229 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.834259 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.834267 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.834280 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.834290 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:23Z","lastTransitionTime":"2025-10-12T07:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.936217 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.936258 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.936268 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.936286 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:23 crc kubenswrapper[4599]: I1012 07:36:23.936296 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:23Z","lastTransitionTime":"2025-10-12T07:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.039290 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.039361 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.039373 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.039397 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.039409 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:24Z","lastTransitionTime":"2025-10-12T07:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.141355 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.141391 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.141401 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.141416 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.141427 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:24Z","lastTransitionTime":"2025-10-12T07:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.242952 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.242989 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.242998 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.243013 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.243022 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:24Z","lastTransitionTime":"2025-10-12T07:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.345211 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.345259 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.345268 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.345290 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.345302 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:24Z","lastTransitionTime":"2025-10-12T07:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.447853 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.447885 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.447896 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.447910 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.447922 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:24Z","lastTransitionTime":"2025-10-12T07:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.544835 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:24 crc kubenswrapper[4599]: E1012 07:36:24.544981 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.550320 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.550375 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.550386 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.550400 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.550408 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:24Z","lastTransitionTime":"2025-10-12T07:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.652999 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.653059 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.653070 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.653091 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.653104 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:24Z","lastTransitionTime":"2025-10-12T07:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.755181 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.755231 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.755241 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.755259 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.755270 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:24Z","lastTransitionTime":"2025-10-12T07:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.857774 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.857821 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.857831 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.857847 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.857861 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:24Z","lastTransitionTime":"2025-10-12T07:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.960093 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.960143 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.960153 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.960174 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:24 crc kubenswrapper[4599]: I1012 07:36:24.960191 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:24Z","lastTransitionTime":"2025-10-12T07:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.062698 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.062748 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.062757 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.062775 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.062786 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:25Z","lastTransitionTime":"2025-10-12T07:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.165788 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.165845 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.165858 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.165875 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.165887 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:25Z","lastTransitionTime":"2025-10-12T07:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.268755 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.268809 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.268818 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.268834 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.268845 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:25Z","lastTransitionTime":"2025-10-12T07:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.295132 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.295438 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-12 07:37:29.29541366 +0000 UTC m=+146.084609162 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.371174 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.371228 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.371236 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.371255 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.371264 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:25Z","lastTransitionTime":"2025-10-12T07:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.397285 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.397534 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.397544 4599 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.397570 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.397635 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.397765 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 07:37:29.397741047 +0000 UTC m=+146.186936549 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.397789 4599 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.397811 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.397823 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.397875 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.397893 4599 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.397837 4599 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.398021 4599 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.397858 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 07:37:29.397833557 +0000 UTC m=+146.187029069 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.398057 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 07:37:29.398041772 +0000 UTC m=+146.187237284 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.398086 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 07:37:29.3980789 +0000 UTC m=+146.187274403 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.473538 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.473581 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.473611 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.473631 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.473647 4599 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:25Z","lastTransitionTime":"2025-10-12T07:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.531727 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.531783 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.531793 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.531812 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.531826 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:25Z","lastTransitionTime":"2025-10-12T07:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.544830 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.544830 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.544830 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.545003 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.544966 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:25Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.545156 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.545227 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.552789 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.553118 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.553182 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.553243 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.553310 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:25Z","lastTransitionTime":"2025-10-12T07:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.563705 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:25Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.568528 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.568636 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.568701 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.568755 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.568810 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:25Z","lastTransitionTime":"2025-10-12T07:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.581348 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.581383 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.581393 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.581408 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.581417 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:25Z","lastTransitionTime":"2025-10-12T07:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.589676 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:25Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.592127 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.592178 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.592188 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.592204 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.592215 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:25Z","lastTransitionTime":"2025-10-12T07:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.600932 4599 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b3fd0901-686d-4e85-8767-c56bc470edcc\\\",\\\"systemUUID\\\":\\\"c0d90076-d180-408b-98a9-f48996ced0a6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:25Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:25 crc kubenswrapper[4599]: E1012 07:36:25.601041 4599 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.602255 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.602289 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.602300 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.602314 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.602325 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:25Z","lastTransitionTime":"2025-10-12T07:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.704971 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.705028 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.705038 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.705053 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.705067 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:25Z","lastTransitionTime":"2025-10-12T07:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.807201 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.807241 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.807250 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.807264 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.807275 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:25Z","lastTransitionTime":"2025-10-12T07:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.909787 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.909822 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.909832 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.909852 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:25 crc kubenswrapper[4599]: I1012 07:36:25.909862 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:25Z","lastTransitionTime":"2025-10-12T07:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.012682 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.012736 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.012746 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.012770 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.012787 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:26Z","lastTransitionTime":"2025-10-12T07:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.115084 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.115151 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.115164 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.115181 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.115196 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:26Z","lastTransitionTime":"2025-10-12T07:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.217617 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.217664 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.217678 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.217696 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.217710 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:26Z","lastTransitionTime":"2025-10-12T07:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.320232 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.320289 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.320301 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.320325 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.320355 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:26Z","lastTransitionTime":"2025-10-12T07:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.422534 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.422626 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.422643 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.422666 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.422679 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:26Z","lastTransitionTime":"2025-10-12T07:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.525440 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.525490 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.525503 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.525521 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.525531 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:26Z","lastTransitionTime":"2025-10-12T07:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.544984 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:26 crc kubenswrapper[4599]: E1012 07:36:26.545140 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.628404 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.628454 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.628466 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.628491 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.628519 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:26Z","lastTransitionTime":"2025-10-12T07:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.730998 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.731057 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.731073 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.731096 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.731113 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:26Z","lastTransitionTime":"2025-10-12T07:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.833863 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.833917 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.833928 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.833948 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.833959 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:26Z","lastTransitionTime":"2025-10-12T07:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.935945 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.935998 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.936012 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.936035 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:26 crc kubenswrapper[4599]: I1012 07:36:26.936052 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:26Z","lastTransitionTime":"2025-10-12T07:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.038491 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.038555 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.038566 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.038588 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.038603 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:27Z","lastTransitionTime":"2025-10-12T07:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.140464 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.140500 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.140508 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.140533 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.140545 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:27Z","lastTransitionTime":"2025-10-12T07:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.242404 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.242455 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.242471 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.242487 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.242499 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:27Z","lastTransitionTime":"2025-10-12T07:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.344601 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.344659 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.344671 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.344690 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.344701 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:27Z","lastTransitionTime":"2025-10-12T07:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.447208 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.447258 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.447268 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.447286 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.447299 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:27Z","lastTransitionTime":"2025-10-12T07:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.545268 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.545367 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.545275 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:27 crc kubenswrapper[4599]: E1012 07:36:27.545571 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:27 crc kubenswrapper[4599]: E1012 07:36:27.545705 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:27 crc kubenswrapper[4599]: E1012 07:36:27.545848 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.549612 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.549654 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.549667 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.549691 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.549707 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:27Z","lastTransitionTime":"2025-10-12T07:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.652609 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.652667 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.652677 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.652695 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.652708 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:27Z","lastTransitionTime":"2025-10-12T07:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.755351 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.755401 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.755410 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.755431 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.755445 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:27Z","lastTransitionTime":"2025-10-12T07:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.858364 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.858412 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.858424 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.858447 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.858461 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:27Z","lastTransitionTime":"2025-10-12T07:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.961482 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.961784 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.961852 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.961983 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:27 crc kubenswrapper[4599]: I1012 07:36:27.962049 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:27Z","lastTransitionTime":"2025-10-12T07:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.065067 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.065106 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.065116 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.065130 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.065142 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:28Z","lastTransitionTime":"2025-10-12T07:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.167280 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.167327 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.167366 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.167390 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.167405 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:28Z","lastTransitionTime":"2025-10-12T07:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.270153 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.270204 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.270216 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.270236 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.270247 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:28Z","lastTransitionTime":"2025-10-12T07:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.372623 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.372670 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.372681 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.372698 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.372708 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:28Z","lastTransitionTime":"2025-10-12T07:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.474947 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.474996 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.475009 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.475030 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.475044 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:28Z","lastTransitionTime":"2025-10-12T07:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.545027 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:28 crc kubenswrapper[4599]: E1012 07:36:28.545197 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.577053 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.577083 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.577093 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.577111 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.577121 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:28Z","lastTransitionTime":"2025-10-12T07:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.678988 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.679307 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.679414 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.679500 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.679567 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:28Z","lastTransitionTime":"2025-10-12T07:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.782433 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.782781 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.782847 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.782965 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.783042 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:28Z","lastTransitionTime":"2025-10-12T07:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.885429 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.885566 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.885635 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.885718 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.885785 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:28Z","lastTransitionTime":"2025-10-12T07:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.988613 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.988783 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.988864 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.988932 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:28 crc kubenswrapper[4599]: I1012 07:36:28.988997 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:28Z","lastTransitionTime":"2025-10-12T07:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.090716 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.090765 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.090776 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.090797 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.090810 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:29Z","lastTransitionTime":"2025-10-12T07:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.192584 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.192616 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.192643 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.192653 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.192661 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:29Z","lastTransitionTime":"2025-10-12T07:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.295309 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.295394 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.295407 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.295432 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.295459 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:29Z","lastTransitionTime":"2025-10-12T07:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.397839 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.397895 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.397905 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.397921 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.397932 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:29Z","lastTransitionTime":"2025-10-12T07:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.500718 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.500789 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.500801 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.500827 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.500841 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:29Z","lastTransitionTime":"2025-10-12T07:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.545245 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.545332 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:29 crc kubenswrapper[4599]: E1012 07:36:29.545538 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.545637 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:29 crc kubenswrapper[4599]: E1012 07:36:29.545862 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:29 crc kubenswrapper[4599]: E1012 07:36:29.546066 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.603289 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.603354 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.603365 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.603381 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.603396 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:29Z","lastTransitionTime":"2025-10-12T07:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.706652 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.706705 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.706714 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.706734 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.706747 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:29Z","lastTransitionTime":"2025-10-12T07:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.809307 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.809379 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.809390 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.809407 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.809419 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:29Z","lastTransitionTime":"2025-10-12T07:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.912523 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.912567 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.912577 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.912599 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:29 crc kubenswrapper[4599]: I1012 07:36:29.912611 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:29Z","lastTransitionTime":"2025-10-12T07:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.015456 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.015518 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.015529 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.015550 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.015563 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:30Z","lastTransitionTime":"2025-10-12T07:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.118087 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.118140 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.118151 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.118172 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.118184 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:30Z","lastTransitionTime":"2025-10-12T07:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.220859 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.220909 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.220919 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.220939 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.220950 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:30Z","lastTransitionTime":"2025-10-12T07:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.323909 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.323972 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.323984 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.324004 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.324017 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:30Z","lastTransitionTime":"2025-10-12T07:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.425641 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.425681 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.425691 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.425703 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.425713 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:30Z","lastTransitionTime":"2025-10-12T07:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.527787 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.527837 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.527848 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.527868 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.527881 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:30Z","lastTransitionTime":"2025-10-12T07:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.544299 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:30 crc kubenswrapper[4599]: E1012 07:36:30.544534 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.630865 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.631052 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.631119 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.631207 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.631268 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:30Z","lastTransitionTime":"2025-10-12T07:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.734013 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.734133 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.734213 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.734274 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.734368 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:30Z","lastTransitionTime":"2025-10-12T07:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.837041 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.837094 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.837108 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.837127 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.837143 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:30Z","lastTransitionTime":"2025-10-12T07:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.939269 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.939318 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.939327 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.939364 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:30 crc kubenswrapper[4599]: I1012 07:36:30.939414 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:30Z","lastTransitionTime":"2025-10-12T07:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.042020 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.042058 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.042067 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.042082 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.042092 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:31Z","lastTransitionTime":"2025-10-12T07:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.144252 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.144275 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.144284 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.144295 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.144304 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:31Z","lastTransitionTime":"2025-10-12T07:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.246556 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.246592 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.246600 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.246614 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.246624 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:31Z","lastTransitionTime":"2025-10-12T07:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.348855 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.348913 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.348925 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.348943 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.348954 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:31Z","lastTransitionTime":"2025-10-12T07:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.451296 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.451378 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.451388 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.451408 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.451423 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:31Z","lastTransitionTime":"2025-10-12T07:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.544799 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.544902 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:31 crc kubenswrapper[4599]: E1012 07:36:31.544989 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.545036 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:31 crc kubenswrapper[4599]: E1012 07:36:31.545219 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:31 crc kubenswrapper[4599]: E1012 07:36:31.545241 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.546040 4599 scope.go:117] "RemoveContainer" containerID="22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1" Oct 12 07:36:31 crc kubenswrapper[4599]: E1012 07:36:31.546236 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\"" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.553310 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.553369 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.553384 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.553399 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.553413 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:31Z","lastTransitionTime":"2025-10-12T07:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.655659 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.655706 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.655715 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.655732 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.655742 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:31Z","lastTransitionTime":"2025-10-12T07:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.758412 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.758451 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.758462 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.758481 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.758492 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:31Z","lastTransitionTime":"2025-10-12T07:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.860917 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.860980 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.860992 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.861013 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.861030 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:31Z","lastTransitionTime":"2025-10-12T07:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.963682 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.963745 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.963759 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.963782 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:31 crc kubenswrapper[4599]: I1012 07:36:31.963798 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:31Z","lastTransitionTime":"2025-10-12T07:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.066101 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.066144 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.066154 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.066172 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.066185 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:32Z","lastTransitionTime":"2025-10-12T07:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.168671 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.168722 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.168733 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.168750 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.168762 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:32Z","lastTransitionTime":"2025-10-12T07:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.271050 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.271095 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.271104 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.271120 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.271132 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:32Z","lastTransitionTime":"2025-10-12T07:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.373731 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.373798 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.373806 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.373824 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.373834 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:32Z","lastTransitionTime":"2025-10-12T07:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.476183 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.476236 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.476249 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.476262 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.476270 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:32Z","lastTransitionTime":"2025-10-12T07:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.544963 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:32 crc kubenswrapper[4599]: E1012 07:36:32.545071 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.578162 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.578215 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.578224 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.578244 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.578257 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:32Z","lastTransitionTime":"2025-10-12T07:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.680137 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.680166 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.680175 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.680202 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.680210 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:32Z","lastTransitionTime":"2025-10-12T07:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.782219 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.782249 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.782257 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.782267 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.782275 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:32Z","lastTransitionTime":"2025-10-12T07:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.884620 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.884652 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.884661 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.884673 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.884700 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:32Z","lastTransitionTime":"2025-10-12T07:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.986295 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.986356 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.986365 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.986374 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:32 crc kubenswrapper[4599]: I1012 07:36:32.986381 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:32Z","lastTransitionTime":"2025-10-12T07:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.090975 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.091559 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.091586 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.091602 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.091611 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:33Z","lastTransitionTime":"2025-10-12T07:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.193940 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.193997 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.194007 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.194026 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.194038 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:33Z","lastTransitionTime":"2025-10-12T07:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.296186 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.296225 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.296251 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.296263 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.296271 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:33Z","lastTransitionTime":"2025-10-12T07:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.398789 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.398842 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.398851 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.398873 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.398886 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:33Z","lastTransitionTime":"2025-10-12T07:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.501675 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.501734 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.501743 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.501763 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.501777 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:33Z","lastTransitionTime":"2025-10-12T07:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.545196 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.545213 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.545222 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:33 crc kubenswrapper[4599]: E1012 07:36:33.545479 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:33 crc kubenswrapper[4599]: E1012 07:36:33.545572 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:33 crc kubenswrapper[4599]: E1012 07:36:33.545627 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.561020 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8afb4c88-1806-483a-9957-30833be28202\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c61ee75e4bbe142c54914177759cf0c49ecc36f904c8c6bf421cd4917256d5a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://545c8135b4e5319ac3b9c84ee50556b66e4f5cfe0987f5aa37ef668130f6f267\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40188eb89126b26a3002269d063827de60fa17f72c18fbbd0e15b16c744e8ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc5c76ab7433309a1f2c32d2ada884b5b0d926fee8219463079f8689a1f7efef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/opensh
ift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21d90e31b7ebf70d48881d9d0837e8a1fdeabc3d53988c7ba993ce28e902d3d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T07:35:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW1012 07:35:20.755648 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1012 07:35:20.755757 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 07:35:20.757774 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2293705747/tls.crt::/tmp/serving-cert-2293705747/tls.key\\\\\\\"\\\\nI1012 07:35:20.975376 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1012 07:35:20.977324 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1012 07:35:20.977357 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1012 07:35:20.977377 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1012 07:35:20.977382 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1012 07:35:20.982144 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1012 07:35:20.982166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1012 07:35:20.982172 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1012 07:35:20.982216 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1012 07:35:20.982220 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1012 07:35:20.982223 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1012 07:35:20.982226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1012 07:35:20.985689 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e595e7d1dee4ecff3b5deb6e363b57de09e4bcfddd6ab9ae9be27b6e3578628\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afc60525d60f5b497a75efb631611d658c23f6336c21ca7421bf3efee3ece0ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.573501 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db79c5e0b03856a5da5fba6b8e6cb77590ef9e29090ac66da24d54ea638a20f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.583173 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.591910 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc694bce-8c25-4729-b452-29d44d3efe6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c222e0cd037e0719d1ea94d4f28639dae42c676fa56b5f57b22cd50d87e6b29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447d
b21a713d651d47b47a8127c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25l5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5mz5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.599966 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29fbfc4b-8e32-4132-a34d-48b25ec31428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2f2c3c17ada7047edb8e0fcd51227104458618c98e5ddec12123aa7ee173e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84de1dd84c3c72376a170b7ccbdfff8c3bdd
5e9e18a344002e112e81ca555a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqqzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xl2ns\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.603895 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.603930 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.603939 4599 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.603954 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.603963 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:33Z","lastTransitionTime":"2025-10-12T07:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.608385 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3aad3aaa-c433-4fe0-b01e-f34979c35817\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443dfe629ce4bc2f6d96108ef419b3f8e577609981571c42882a8b3eb587722a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b748451e24963062c3aec5bf944775ca77c166a8b97b4309b9102a9157312cdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b748451e24963062c3aec5bf944775ca77c166a8b97b4309b9102a9157312cdd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.618638 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2869b43-5f5f-4ed7-9dc3-63d210bde137\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21abc118842f09c54256bd05f1e5978c928bad569bc9b52157287576117fb49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb25175bfe42e193200fbc76f82294afbd5f23d9653637c12ddf67553431748e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da490def77b19758f5479ef7070846dee19a5057bd6b0568d96fd38a8dd6528\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://622eac195ecab593bc5fa71a4a88ac987a45be5fb130336aae7acd680c51abe4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.630044 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18ca8765-c435-4750-b803-14b539958d9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4db4e7d06f60157d91fcb8514d176f78540085701360ebcfa7eabb36fe84d6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16c75354b9fbe00273ab4aaa5ca1306c5d097996b8ce8074a2cee8185118a001\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a7c28a100796e21008c597d15ee36786f7e87553a8a4feb5734ac79cef9b556\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6388c1e143c0caded216941ec25bd07be0a14f2f97f6328c35d5cc04815a88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2017
74968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e201774968890569efafc736378cff0d700067bb0bd39677f42ef65fa2c07646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f07c63fbebde2f05a5b1682af69eb42b5aec20728e0997f1ebc2f5a59815ad1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f62b824f1e7400e9588b3d3a1afc52f86b3bcae039fbe3c757e1e01733e3a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-st7tw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9xbn5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.638489 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8hm26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce311f52-0501-45d3-8209-b1d2aa25028b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d07bb1c67c880fb49b68be21180f0a5a053da62097b38605440625d10650033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-10-12T07:36:07Z\\\",\\\"message\\\":\\\"2025-10-12T07:35:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dbf0399a-84fd-451c-96bc-e20b8ff4f200\\\\n2025-10-12T07:35:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dbf0399a-84fd-451c-96bc-e20b8ff4f200 to /host/opt/cni/bin/\\\\n2025-10-12T07:35:22Z [verbose] multus-daemon started\\\\n2025-10-12T07:35:22Z [verbose] Readiness Indicator file check\\\\n2025-10-12T07:36:07Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-54j2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8hm26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.646565 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.653942 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.662217 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4b9189620cc5d626fe6c3e7bbe0efd22f7a6f061bc2afd3d33b1e04ebdbfb1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc203371af6667734441e7dca5f3b301365eb1136f363df8c87377ae58168130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.670667 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b574e7cb76b05a81f63a55ce4544b1d4b3a58e6793be2c89620f66282642e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T07:36:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.677782 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"636df6e3-ed6a-452a-9d5d-e26139f62951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f52b4e0ae3f90600ec5e799352a1f70b4fad54a4805616d1508c22af008a0f8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://56aa5b66fc1ca787e84c8ca2a717038f9e522862fb680d5271921e410d5841f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://091875c6e2e9f643a0c31783ab1905c5fd448d20e98b9b4de70be991e7e8f1c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d08b26fb90464c31907b23edfd9f4997375c3178e8699cf951af85424d9354d9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.683760 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rxr42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92909e59-b659-4fc0-91c0-1880ff96b4f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4162f0c9d7789e80004cf3ea54eed47b4d473832cf68a03e348973c300f9dcc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7hr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rxr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.689899 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f5988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0147ed28-bc0f-409c-a813-dc2ffffba092\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b06de751793dade650ec18334a6bcf8bb26e5e89af47ede264d087d1a950178a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzz9z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f5988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.702096 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a95b7ab-8632-4332-a30f-64f28ef8d313\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T07:36:16Z\\\",\\\"message\\\":\\\"TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1012 07:36:16.300660 6611 services_controller.go:454] Service openshift-marketplace/redhat-operators for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load 
balancers\\\\nI1012 07:36:16.300096 6611 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-rxr42 in node crc\\\\nI1012 07:36:16.300675 6611 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-rxr42 after 0 failed attempt(s)\\\\nI1012 07:36:16.300681 6611 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-rxr42\\\\nI1012 07:36:16.300674 6611 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T07:36:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T07:35:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4f64c7fab07385f9
215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T07:35:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T07:35:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzwzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whk5b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.705935 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.705984 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.705997 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.706016 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.706028 4599 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:33Z","lastTransitionTime":"2025-10-12T07:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.710892 4599 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kwphq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3e76cc-139b-4a2a-b96b-6077e3706376\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T07:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9v52\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T07:35:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kwphq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T07:36:33Z is after 2025-08-24T17:21:41Z" Oct 12 07:36:33 crc 
kubenswrapper[4599]: I1012 07:36:33.808557 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.808593 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.808602 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.808615 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.808626 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:33Z","lastTransitionTime":"2025-10-12T07:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.910245 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.910296 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.910305 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.910324 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:33 crc kubenswrapper[4599]: I1012 07:36:33.910359 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:33Z","lastTransitionTime":"2025-10-12T07:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.012082 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.012133 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.012143 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.012163 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.012176 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:34Z","lastTransitionTime":"2025-10-12T07:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.115044 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.115181 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.115279 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.115365 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.115441 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:34Z","lastTransitionTime":"2025-10-12T07:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.218041 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.218092 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.218102 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.218116 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.218127 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:34Z","lastTransitionTime":"2025-10-12T07:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.320997 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.321034 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.321044 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.321062 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.321072 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:34Z","lastTransitionTime":"2025-10-12T07:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.425011 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.425081 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.425093 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.425315 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.425326 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:34Z","lastTransitionTime":"2025-10-12T07:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.527585 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.527627 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.527636 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.527651 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.527660 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:34Z","lastTransitionTime":"2025-10-12T07:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.544892 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:34 crc kubenswrapper[4599]: E1012 07:36:34.545005 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.631079 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.631131 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.631141 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.631165 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.631179 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:34Z","lastTransitionTime":"2025-10-12T07:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.734150 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.734191 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.734202 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.734222 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.734233 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:34Z","lastTransitionTime":"2025-10-12T07:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.836258 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.836309 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.836317 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.836346 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.836356 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:34Z","lastTransitionTime":"2025-10-12T07:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.939241 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.939285 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.939298 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.939317 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:34 crc kubenswrapper[4599]: I1012 07:36:34.939329 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:34Z","lastTransitionTime":"2025-10-12T07:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.041444 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.041474 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.041486 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.041500 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.041512 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:35Z","lastTransitionTime":"2025-10-12T07:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.144142 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.144180 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.144192 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.144208 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.144217 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:35Z","lastTransitionTime":"2025-10-12T07:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.246413 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.246445 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.246454 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.246468 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.246480 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:35Z","lastTransitionTime":"2025-10-12T07:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.348196 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.348632 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.348716 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.348781 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.348845 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:35Z","lastTransitionTime":"2025-10-12T07:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.450770 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.451017 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.451097 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.451158 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.451230 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:35Z","lastTransitionTime":"2025-10-12T07:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.544446 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.544534 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:35 crc kubenswrapper[4599]: E1012 07:36:35.544593 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:35 crc kubenswrapper[4599]: E1012 07:36:35.544723 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.544815 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:35 crc kubenswrapper[4599]: E1012 07:36:35.545515 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.553888 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.553920 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.553929 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.553947 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.553959 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:35Z","lastTransitionTime":"2025-10-12T07:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.656746 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.656793 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.656804 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.656824 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.656835 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:35Z","lastTransitionTime":"2025-10-12T07:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.758606 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.758641 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.758651 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.758669 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.758680 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:35Z","lastTransitionTime":"2025-10-12T07:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.861069 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.861224 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.861311 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.861410 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.861471 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:35Z","lastTransitionTime":"2025-10-12T07:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.891398 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.891425 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.891435 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.891449 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.891459 4599 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T07:36:35Z","lastTransitionTime":"2025-10-12T07:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.934051 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8"] Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.934487 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.935921 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.935921 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.936311 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.936931 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.948395 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=43.94837583 podStartE2EDuration="43.94837583s" podCreationTimestamp="2025-10-12 07:35:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:36:35.948074855 +0000 UTC m=+92.737270358" watchObservedRunningTime="2025-10-12 07:36:35.94837583 +0000 UTC m=+92.737571333" Oct 12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.967754 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rxr42" podStartSLOduration=75.96773764 podStartE2EDuration="1m15.96773764s" podCreationTimestamp="2025-10-12 07:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:36:35.95851451 +0000 UTC m=+92.747710012" watchObservedRunningTime="2025-10-12 07:36:35.96773764 +0000 UTC m=+92.756933142" Oct 
12 07:36:35 crc kubenswrapper[4599]: I1012 07:36:35.984758 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-f5988" podStartSLOduration=75.984736056 podStartE2EDuration="1m15.984736056s" podCreationTimestamp="2025-10-12 07:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:36:35.967886134 +0000 UTC m=+92.757081636" watchObservedRunningTime="2025-10-12 07:36:35.984736056 +0000 UTC m=+92.773931557" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.042100 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=75.042081518 podStartE2EDuration="1m15.042081518s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:36:36.019576024 +0000 UTC m=+92.808771526" watchObservedRunningTime="2025-10-12 07:36:36.042081518 +0000 UTC m=+92.831277021" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.080162 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podStartSLOduration=76.080145679 podStartE2EDuration="1m16.080145679s" podCreationTimestamp="2025-10-12 07:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:36:36.069569601 +0000 UTC m=+92.858765104" watchObservedRunningTime="2025-10-12 07:36:36.080145679 +0000 UTC m=+92.869341182" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.089368 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=15.089346719 podStartE2EDuration="15.089346719s" 
podCreationTimestamp="2025-10-12 07:36:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:36:36.088818343 +0000 UTC m=+92.878013845" watchObservedRunningTime="2025-10-12 07:36:36.089346719 +0000 UTC m=+92.878542221" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.089792 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xl2ns" podStartSLOduration=75.089786591 podStartE2EDuration="1m15.089786591s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:36:36.080506636 +0000 UTC m=+92.869702138" watchObservedRunningTime="2025-10-12 07:36:36.089786591 +0000 UTC m=+92.878982083" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.098764 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=75.098758887 podStartE2EDuration="1m15.098758887s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:36:36.09814265 +0000 UTC m=+92.887338151" watchObservedRunningTime="2025-10-12 07:36:36.098758887 +0000 UTC m=+92.887954389" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.105397 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf9c11b8-d8d0-439c-b408-0fb5b7038a6e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-q2nf8\" (UID: \"bf9c11b8-d8d0-439c-b408-0fb5b7038a6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 
07:36:36.105518 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bf9c11b8-d8d0-439c-b408-0fb5b7038a6e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-q2nf8\" (UID: \"bf9c11b8-d8d0-439c-b408-0fb5b7038a6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.105638 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bf9c11b8-d8d0-439c-b408-0fb5b7038a6e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-q2nf8\" (UID: \"bf9c11b8-d8d0-439c-b408-0fb5b7038a6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.105718 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf9c11b8-d8d0-439c-b408-0fb5b7038a6e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-q2nf8\" (UID: \"bf9c11b8-d8d0-439c-b408-0fb5b7038a6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.105788 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf9c11b8-d8d0-439c-b408-0fb5b7038a6e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-q2nf8\" (UID: \"bf9c11b8-d8d0-439c-b408-0fb5b7038a6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.160251 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9xbn5" podStartSLOduration=75.160229756 
podStartE2EDuration="1m15.160229756s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:36:36.150247506 +0000 UTC m=+92.939443008" watchObservedRunningTime="2025-10-12 07:36:36.160229756 +0000 UTC m=+92.949425259" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.160701 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8hm26" podStartSLOduration=76.160696818 podStartE2EDuration="1m16.160696818s" podCreationTimestamp="2025-10-12 07:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:36:36.159669582 +0000 UTC m=+92.948865085" watchObservedRunningTime="2025-10-12 07:36:36.160696818 +0000 UTC m=+92.949892320" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.206873 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf9c11b8-d8d0-439c-b408-0fb5b7038a6e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-q2nf8\" (UID: \"bf9c11b8-d8d0-439c-b408-0fb5b7038a6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.207194 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bf9c11b8-d8d0-439c-b408-0fb5b7038a6e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-q2nf8\" (UID: \"bf9c11b8-d8d0-439c-b408-0fb5b7038a6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.207320 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/bf9c11b8-d8d0-439c-b408-0fb5b7038a6e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-q2nf8\" (UID: \"bf9c11b8-d8d0-439c-b408-0fb5b7038a6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.207464 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf9c11b8-d8d0-439c-b408-0fb5b7038a6e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-q2nf8\" (UID: \"bf9c11b8-d8d0-439c-b408-0fb5b7038a6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.207564 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf9c11b8-d8d0-439c-b408-0fb5b7038a6e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-q2nf8\" (UID: \"bf9c11b8-d8d0-439c-b408-0fb5b7038a6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.207562 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bf9c11b8-d8d0-439c-b408-0fb5b7038a6e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-q2nf8\" (UID: \"bf9c11b8-d8d0-439c-b408-0fb5b7038a6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.207569 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bf9c11b8-d8d0-439c-b408-0fb5b7038a6e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-q2nf8\" (UID: \"bf9c11b8-d8d0-439c-b408-0fb5b7038a6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8" Oct 12 07:36:36 crc 
kubenswrapper[4599]: I1012 07:36:36.208058 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf9c11b8-d8d0-439c-b408-0fb5b7038a6e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-q2nf8\" (UID: \"bf9c11b8-d8d0-439c-b408-0fb5b7038a6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.214909 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf9c11b8-d8d0-439c-b408-0fb5b7038a6e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-q2nf8\" (UID: \"bf9c11b8-d8d0-439c-b408-0fb5b7038a6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.221805 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf9c11b8-d8d0-439c-b408-0fb5b7038a6e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-q2nf8\" (UID: \"bf9c11b8-d8d0-439c-b408-0fb5b7038a6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.243578 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.544705 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:36 crc kubenswrapper[4599]: E1012 07:36:36.544892 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.953582 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8" event={"ID":"bf9c11b8-d8d0-439c-b408-0fb5b7038a6e","Type":"ContainerStarted","Data":"04f888b8a074ad0e666de0c40950a9d39c112268d21cfc7fb58831fd6a0f6b83"} Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.953643 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8" event={"ID":"bf9c11b8-d8d0-439c-b408-0fb5b7038a6e","Type":"ContainerStarted","Data":"3094b87c9153df8812edd99cf09a57b2882a889c13fb62cfed24331b5a8b6ccd"} Oct 12 07:36:36 crc kubenswrapper[4599]: I1012 07:36:36.965974 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q2nf8" podStartSLOduration=76.965956897 podStartE2EDuration="1m16.965956897s" podCreationTimestamp="2025-10-12 07:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:36:36.965401191 +0000 UTC m=+93.754596692" watchObservedRunningTime="2025-10-12 07:36:36.965956897 +0000 UTC m=+93.755152398" Oct 12 07:36:37 crc kubenswrapper[4599]: I1012 07:36:37.544672 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:37 crc kubenswrapper[4599]: I1012 07:36:37.544758 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:37 crc kubenswrapper[4599]: I1012 07:36:37.544768 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:37 crc kubenswrapper[4599]: E1012 07:36:37.545556 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:37 crc kubenswrapper[4599]: E1012 07:36:37.545878 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:37 crc kubenswrapper[4599]: E1012 07:36:37.545857 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:38 crc kubenswrapper[4599]: I1012 07:36:38.526486 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs\") pod \"network-metrics-daemon-kwphq\" (UID: \"3c3e76cc-139b-4a2a-b96b-6077e3706376\") " pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:38 crc kubenswrapper[4599]: E1012 07:36:38.526662 4599 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 07:36:38 crc kubenswrapper[4599]: E1012 07:36:38.526732 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs podName:3c3e76cc-139b-4a2a-b96b-6077e3706376 nodeName:}" failed. No retries permitted until 2025-10-12 07:37:42.526709513 +0000 UTC m=+159.315905015 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs") pod "network-metrics-daemon-kwphq" (UID: "3c3e76cc-139b-4a2a-b96b-6077e3706376") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 07:36:38 crc kubenswrapper[4599]: I1012 07:36:38.544753 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:38 crc kubenswrapper[4599]: E1012 07:36:38.544928 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:39 crc kubenswrapper[4599]: I1012 07:36:39.544419 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:39 crc kubenswrapper[4599]: I1012 07:36:39.544441 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:39 crc kubenswrapper[4599]: I1012 07:36:39.544474 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:39 crc kubenswrapper[4599]: E1012 07:36:39.544809 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:39 crc kubenswrapper[4599]: E1012 07:36:39.544947 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:39 crc kubenswrapper[4599]: E1012 07:36:39.545001 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:39 crc kubenswrapper[4599]: I1012 07:36:39.556819 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 12 07:36:40 crc kubenswrapper[4599]: I1012 07:36:40.544753 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:40 crc kubenswrapper[4599]: E1012 07:36:40.544890 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:41 crc kubenswrapper[4599]: I1012 07:36:41.545041 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:41 crc kubenswrapper[4599]: I1012 07:36:41.545075 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:41 crc kubenswrapper[4599]: I1012 07:36:41.545108 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:41 crc kubenswrapper[4599]: E1012 07:36:41.545193 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:41 crc kubenswrapper[4599]: E1012 07:36:41.545267 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:41 crc kubenswrapper[4599]: E1012 07:36:41.545371 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:42 crc kubenswrapper[4599]: I1012 07:36:42.544410 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:42 crc kubenswrapper[4599]: E1012 07:36:42.544548 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:43 crc kubenswrapper[4599]: I1012 07:36:43.544428 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:43 crc kubenswrapper[4599]: I1012 07:36:43.545507 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:43 crc kubenswrapper[4599]: I1012 07:36:43.545531 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:43 crc kubenswrapper[4599]: E1012 07:36:43.545850 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:43 crc kubenswrapper[4599]: I1012 07:36:43.546173 4599 scope.go:117] "RemoveContainer" containerID="22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1" Oct 12 07:36:43 crc kubenswrapper[4599]: E1012 07:36:43.546179 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:43 crc kubenswrapper[4599]: E1012 07:36:43.546295 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:43 crc kubenswrapper[4599]: E1012 07:36:43.546357 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\"" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" Oct 12 07:36:43 crc kubenswrapper[4599]: I1012 07:36:43.568572 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=4.56856075 podStartE2EDuration="4.56856075s" podCreationTimestamp="2025-10-12 07:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:36:43.568396346 +0000 UTC m=+100.357591848" watchObservedRunningTime="2025-10-12 07:36:43.56856075 +0000 UTC m=+100.357756251" Oct 12 07:36:44 crc kubenswrapper[4599]: I1012 07:36:44.544946 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:44 crc kubenswrapper[4599]: E1012 07:36:44.545721 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:45 crc kubenswrapper[4599]: I1012 07:36:45.544548 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:45 crc kubenswrapper[4599]: I1012 07:36:45.544630 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:45 crc kubenswrapper[4599]: E1012 07:36:45.544694 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:45 crc kubenswrapper[4599]: I1012 07:36:45.544548 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:45 crc kubenswrapper[4599]: E1012 07:36:45.544775 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:45 crc kubenswrapper[4599]: E1012 07:36:45.544881 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:46 crc kubenswrapper[4599]: I1012 07:36:46.544672 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:46 crc kubenswrapper[4599]: E1012 07:36:46.544802 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:47 crc kubenswrapper[4599]: I1012 07:36:47.544930 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:47 crc kubenswrapper[4599]: I1012 07:36:47.545023 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:47 crc kubenswrapper[4599]: E1012 07:36:47.545065 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:47 crc kubenswrapper[4599]: E1012 07:36:47.545207 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:47 crc kubenswrapper[4599]: I1012 07:36:47.545249 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:47 crc kubenswrapper[4599]: E1012 07:36:47.545503 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:48 crc kubenswrapper[4599]: I1012 07:36:48.544864 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:48 crc kubenswrapper[4599]: E1012 07:36:48.545099 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:49 crc kubenswrapper[4599]: I1012 07:36:49.544167 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:49 crc kubenswrapper[4599]: E1012 07:36:49.544266 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:49 crc kubenswrapper[4599]: I1012 07:36:49.544362 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:49 crc kubenswrapper[4599]: I1012 07:36:49.544412 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:49 crc kubenswrapper[4599]: E1012 07:36:49.544512 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:49 crc kubenswrapper[4599]: E1012 07:36:49.544554 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:50 crc kubenswrapper[4599]: I1012 07:36:50.544570 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:50 crc kubenswrapper[4599]: E1012 07:36:50.544729 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:51 crc kubenswrapper[4599]: I1012 07:36:51.544789 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:51 crc kubenswrapper[4599]: I1012 07:36:51.544786 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:51 crc kubenswrapper[4599]: I1012 07:36:51.545453 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:51 crc kubenswrapper[4599]: E1012 07:36:51.545905 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:51 crc kubenswrapper[4599]: E1012 07:36:51.545990 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:51 crc kubenswrapper[4599]: E1012 07:36:51.546072 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:52 crc kubenswrapper[4599]: I1012 07:36:52.545192 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:52 crc kubenswrapper[4599]: E1012 07:36:52.545390 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:53 crc kubenswrapper[4599]: I1012 07:36:53.544731 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:53 crc kubenswrapper[4599]: I1012 07:36:53.544884 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:53 crc kubenswrapper[4599]: I1012 07:36:53.545250 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:53 crc kubenswrapper[4599]: E1012 07:36:53.545766 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:53 crc kubenswrapper[4599]: E1012 07:36:53.545946 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:53 crc kubenswrapper[4599]: E1012 07:36:53.546042 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:54 crc kubenswrapper[4599]: I1012 07:36:54.003740 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hm26_ce311f52-0501-45d3-8209-b1d2aa25028b/kube-multus/1.log" Oct 12 07:36:54 crc kubenswrapper[4599]: I1012 07:36:54.004301 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hm26_ce311f52-0501-45d3-8209-b1d2aa25028b/kube-multus/0.log" Oct 12 07:36:54 crc kubenswrapper[4599]: I1012 07:36:54.004374 4599 generic.go:334] "Generic (PLEG): container finished" podID="ce311f52-0501-45d3-8209-b1d2aa25028b" containerID="1d07bb1c67c880fb49b68be21180f0a5a053da62097b38605440625d10650033" exitCode=1 Oct 12 07:36:54 crc kubenswrapper[4599]: I1012 07:36:54.004416 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8hm26" event={"ID":"ce311f52-0501-45d3-8209-b1d2aa25028b","Type":"ContainerDied","Data":"1d07bb1c67c880fb49b68be21180f0a5a053da62097b38605440625d10650033"} Oct 12 07:36:54 crc kubenswrapper[4599]: I1012 07:36:54.004469 4599 scope.go:117] "RemoveContainer" containerID="611114a1fecbd461507dd2f566f975c7ef3e8d7655d90c1a5b31cd05a430cb38" Oct 12 07:36:54 crc kubenswrapper[4599]: I1012 07:36:54.004915 4599 scope.go:117] "RemoveContainer" containerID="1d07bb1c67c880fb49b68be21180f0a5a053da62097b38605440625d10650033" Oct 12 07:36:54 crc kubenswrapper[4599]: E1012 07:36:54.005116 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-8hm26_openshift-multus(ce311f52-0501-45d3-8209-b1d2aa25028b)\"" pod="openshift-multus/multus-8hm26" podUID="ce311f52-0501-45d3-8209-b1d2aa25028b" Oct 12 07:36:54 crc kubenswrapper[4599]: I1012 07:36:54.544751 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:54 crc kubenswrapper[4599]: E1012 07:36:54.544928 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:55 crc kubenswrapper[4599]: I1012 07:36:55.008749 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hm26_ce311f52-0501-45d3-8209-b1d2aa25028b/kube-multus/1.log" Oct 12 07:36:55 crc kubenswrapper[4599]: I1012 07:36:55.545159 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:55 crc kubenswrapper[4599]: I1012 07:36:55.545205 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:55 crc kubenswrapper[4599]: E1012 07:36:55.545311 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:55 crc kubenswrapper[4599]: I1012 07:36:55.545231 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:55 crc kubenswrapper[4599]: E1012 07:36:55.545464 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:55 crc kubenswrapper[4599]: E1012 07:36:55.545960 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:55 crc kubenswrapper[4599]: I1012 07:36:55.546364 4599 scope.go:117] "RemoveContainer" containerID="22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1" Oct 12 07:36:55 crc kubenswrapper[4599]: E1012 07:36:55.546591 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-whk5b_openshift-ovn-kubernetes(1a95b7ab-8632-4332-a30f-64f28ef8d313)\"" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" Oct 12 07:36:56 crc kubenswrapper[4599]: I1012 07:36:56.545177 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:56 crc kubenswrapper[4599]: E1012 07:36:56.545378 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:57 crc kubenswrapper[4599]: I1012 07:36:57.545019 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:57 crc kubenswrapper[4599]: I1012 07:36:57.545167 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:57 crc kubenswrapper[4599]: I1012 07:36:57.545312 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:57 crc kubenswrapper[4599]: E1012 07:36:57.545418 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:36:57 crc kubenswrapper[4599]: E1012 07:36:57.545175 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:57 crc kubenswrapper[4599]: E1012 07:36:57.546228 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:58 crc kubenswrapper[4599]: I1012 07:36:58.544987 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:36:58 crc kubenswrapper[4599]: E1012 07:36:58.545130 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:36:59 crc kubenswrapper[4599]: I1012 07:36:59.544701 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:36:59 crc kubenswrapper[4599]: I1012 07:36:59.544829 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:36:59 crc kubenswrapper[4599]: E1012 07:36:59.544954 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:36:59 crc kubenswrapper[4599]: I1012 07:36:59.544974 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:36:59 crc kubenswrapper[4599]: E1012 07:36:59.545103 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:36:59 crc kubenswrapper[4599]: E1012 07:36:59.545209 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:37:00 crc kubenswrapper[4599]: I1012 07:37:00.544838 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:37:00 crc kubenswrapper[4599]: E1012 07:37:00.545026 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:37:01 crc kubenswrapper[4599]: I1012 07:37:01.544605 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:37:01 crc kubenswrapper[4599]: I1012 07:37:01.544663 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:37:01 crc kubenswrapper[4599]: E1012 07:37:01.544750 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:37:01 crc kubenswrapper[4599]: E1012 07:37:01.544899 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:37:01 crc kubenswrapper[4599]: I1012 07:37:01.544626 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:37:01 crc kubenswrapper[4599]: E1012 07:37:01.544998 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:37:02 crc kubenswrapper[4599]: I1012 07:37:02.544541 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:37:02 crc kubenswrapper[4599]: E1012 07:37:02.544706 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:37:03 crc kubenswrapper[4599]: E1012 07:37:03.519251 4599 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 12 07:37:03 crc kubenswrapper[4599]: I1012 07:37:03.544861 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:37:03 crc kubenswrapper[4599]: E1012 07:37:03.545973 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:37:03 crc kubenswrapper[4599]: I1012 07:37:03.546079 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:37:03 crc kubenswrapper[4599]: I1012 07:37:03.546120 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:37:03 crc kubenswrapper[4599]: E1012 07:37:03.546187 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:37:03 crc kubenswrapper[4599]: E1012 07:37:03.546523 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:37:03 crc kubenswrapper[4599]: E1012 07:37:03.619568 4599 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 12 07:37:04 crc kubenswrapper[4599]: I1012 07:37:04.545021 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:37:04 crc kubenswrapper[4599]: E1012 07:37:04.545169 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:37:05 crc kubenswrapper[4599]: I1012 07:37:05.544590 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:37:05 crc kubenswrapper[4599]: E1012 07:37:05.544729 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:37:05 crc kubenswrapper[4599]: I1012 07:37:05.544592 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:37:05 crc kubenswrapper[4599]: I1012 07:37:05.544817 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:37:05 crc kubenswrapper[4599]: E1012 07:37:05.544843 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:37:05 crc kubenswrapper[4599]: E1012 07:37:05.544925 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:37:06 crc kubenswrapper[4599]: I1012 07:37:06.545166 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:37:06 crc kubenswrapper[4599]: E1012 07:37:06.545311 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:37:07 crc kubenswrapper[4599]: I1012 07:37:07.545082 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:37:07 crc kubenswrapper[4599]: I1012 07:37:07.545363 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:37:07 crc kubenswrapper[4599]: E1012 07:37:07.545598 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:37:07 crc kubenswrapper[4599]: E1012 07:37:07.545666 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:37:07 crc kubenswrapper[4599]: I1012 07:37:07.545873 4599 scope.go:117] "RemoveContainer" containerID="1d07bb1c67c880fb49b68be21180f0a5a053da62097b38605440625d10650033" Oct 12 07:37:07 crc kubenswrapper[4599]: I1012 07:37:07.545979 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:37:07 crc kubenswrapper[4599]: E1012 07:37:07.546290 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:37:08 crc kubenswrapper[4599]: I1012 07:37:08.050512 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hm26_ce311f52-0501-45d3-8209-b1d2aa25028b/kube-multus/1.log" Oct 12 07:37:08 crc kubenswrapper[4599]: I1012 07:37:08.050832 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8hm26" event={"ID":"ce311f52-0501-45d3-8209-b1d2aa25028b","Type":"ContainerStarted","Data":"cba10cb566eaf12febccccdd40f3241b968188857ba61ada0445c5f8b6b5b363"} Oct 12 07:37:08 crc kubenswrapper[4599]: I1012 07:37:08.544445 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:37:08 crc kubenswrapper[4599]: E1012 07:37:08.544596 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:37:08 crc kubenswrapper[4599]: E1012 07:37:08.620866 4599 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 12 07:37:09 crc kubenswrapper[4599]: I1012 07:37:09.544498 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:37:09 crc kubenswrapper[4599]: I1012 07:37:09.544530 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:37:09 crc kubenswrapper[4599]: E1012 07:37:09.544635 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:37:09 crc kubenswrapper[4599]: I1012 07:37:09.544666 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:37:09 crc kubenswrapper[4599]: E1012 07:37:09.544734 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:37:09 crc kubenswrapper[4599]: E1012 07:37:09.544784 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:37:10 crc kubenswrapper[4599]: I1012 07:37:10.544669 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:37:10 crc kubenswrapper[4599]: E1012 07:37:10.545833 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:37:10 crc kubenswrapper[4599]: I1012 07:37:10.546175 4599 scope.go:117] "RemoveContainer" containerID="22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1" Oct 12 07:37:11 crc kubenswrapper[4599]: I1012 07:37:11.061383 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whk5b_1a95b7ab-8632-4332-a30f-64f28ef8d313/ovnkube-controller/3.log" Oct 12 07:37:11 crc kubenswrapper[4599]: I1012 07:37:11.063602 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerStarted","Data":"6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f"} Oct 12 07:37:11 crc kubenswrapper[4599]: I1012 07:37:11.064046 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:37:11 crc kubenswrapper[4599]: I1012 07:37:11.085207 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" podStartSLOduration=110.085191731 podStartE2EDuration="1m50.085191731s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:11.084412385 +0000 UTC m=+127.873607887" watchObservedRunningTime="2025-10-12 07:37:11.085191731 +0000 UTC m=+127.874387234" Oct 12 07:37:11 crc kubenswrapper[4599]: I1012 07:37:11.193652 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kwphq"] Oct 12 07:37:11 crc kubenswrapper[4599]: I1012 07:37:11.194061 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:37:11 crc kubenswrapper[4599]: E1012 07:37:11.194219 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:37:11 crc kubenswrapper[4599]: I1012 07:37:11.544406 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:37:11 crc kubenswrapper[4599]: I1012 07:37:11.544480 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:37:11 crc kubenswrapper[4599]: E1012 07:37:11.544524 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:37:11 crc kubenswrapper[4599]: I1012 07:37:11.544491 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:37:11 crc kubenswrapper[4599]: E1012 07:37:11.544628 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:37:11 crc kubenswrapper[4599]: E1012 07:37:11.544703 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:37:12 crc kubenswrapper[4599]: I1012 07:37:12.544567 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:37:12 crc kubenswrapper[4599]: E1012 07:37:12.544714 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kwphq" podUID="3c3e76cc-139b-4a2a-b96b-6077e3706376" Oct 12 07:37:13 crc kubenswrapper[4599]: I1012 07:37:13.544629 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:37:13 crc kubenswrapper[4599]: I1012 07:37:13.544654 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:37:13 crc kubenswrapper[4599]: I1012 07:37:13.544676 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:37:13 crc kubenswrapper[4599]: E1012 07:37:13.545463 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 07:37:13 crc kubenswrapper[4599]: E1012 07:37:13.545560 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 07:37:13 crc kubenswrapper[4599]: E1012 07:37:13.545643 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 07:37:14 crc kubenswrapper[4599]: I1012 07:37:14.544626 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:37:14 crc kubenswrapper[4599]: I1012 07:37:14.546441 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 12 07:37:14 crc kubenswrapper[4599]: I1012 07:37:14.547159 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 12 07:37:15 crc kubenswrapper[4599]: I1012 07:37:15.544930 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:37:15 crc kubenswrapper[4599]: I1012 07:37:15.544977 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:37:15 crc kubenswrapper[4599]: I1012 07:37:15.545009 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:37:15 crc kubenswrapper[4599]: I1012 07:37:15.547260 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 12 07:37:15 crc kubenswrapper[4599]: I1012 07:37:15.547506 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 12 07:37:15 crc kubenswrapper[4599]: I1012 07:37:15.547742 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 12 07:37:15 crc kubenswrapper[4599]: I1012 07:37:15.547923 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.543144 4599 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 12 07:37:16 crc 
kubenswrapper[4599]: I1012 07:37:16.573195 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4twv2"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.573701 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.573809 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tgck5"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.574311 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-tgck5" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.574835 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-nlffm"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.575126 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-nlffm" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.575867 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.576188 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.578782 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gj5kj"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.579425 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-v2b5m"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.580289 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.580610 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gj5kj" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.582793 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.583317 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 12 07:37:16 crc kubenswrapper[4599]: W1012 07:37:16.583322 4599 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-tls": failed to list *v1.Secret: secrets "machine-approver-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Oct 12 07:37:16 crc kubenswrapper[4599]: E1012 07:37:16.583390 4599 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-approver-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API 
group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 12 07:37:16 crc kubenswrapper[4599]: W1012 07:37:16.583467 4599 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4": failed to list *v1.Secret: secrets "machine-approver-sa-dockercfg-nl2j4" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Oct 12 07:37:16 crc kubenswrapper[4599]: E1012 07:37:16.583481 4599 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-sa-dockercfg-nl2j4\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-approver-sa-dockercfg-nl2j4\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.583581 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.583582 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.584145 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 12 07:37:16 crc kubenswrapper[4599]: W1012 07:37:16.584421 4599 reflector.go:561] object-"openshift-cluster-machine-approver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace 
"openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Oct 12 07:37:16 crc kubenswrapper[4599]: E1012 07:37:16.584444 4599 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.584484 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.584618 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 12 07:37:16 crc kubenswrapper[4599]: W1012 07:37:16.584650 4599 reflector.go:561] object-"openshift-cluster-machine-approver"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Oct 12 07:37:16 crc kubenswrapper[4599]: E1012 07:37:16.584669 4599 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.584622 4599 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.584894 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Oct 12 07:37:16 crc kubenswrapper[4599]: W1012 07:37:16.584999 4599 reflector.go:561] object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object
Oct 12 07:37:16 crc kubenswrapper[4599]: E1012 07:37:16.585042 4599 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.585078 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.585242 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.596722 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-kf2t6"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.597251 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-kf2t6"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.597480 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-dn2jf"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.598121 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-dn2jf"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.600101 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.600658 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ggnjr"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.600953 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvpk5"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.601281 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvpk5"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.601673 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.601939 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-r64z4"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.601960 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ggnjr"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.602416 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.602549 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-r64z4"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.603311 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.603317 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: W1012 07:37:16.604167 4599 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-config": failed to list *v1.ConfigMap: configmaps "machine-approver-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object
Oct 12 07:37:16 crc kubenswrapper[4599]: E1012 07:37:16.604190 4599 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"machine-approver-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.604288 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.604399 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.604481 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.604401 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.604604 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.604701 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.604855 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.605395 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.605497 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.605638 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.605744 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.606633 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.606688 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.607076 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.607134 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.607158 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.607282 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.607321 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.604490 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.607485 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.607517 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.607284 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.607637 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.608074 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.608298 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.608561 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.610504 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.610544 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.610778 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.610922 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.610997 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.611093 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.610930 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-psd6r"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.612268 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-26bbs"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.611251 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.612638 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-26bbs"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.612922 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-psd6r"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.613476 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.613662 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.614053 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.614066 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.615029 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jlnn9"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.615372 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-42f4k"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.615622 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-42f4k"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.615847 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.616079 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.616447 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.617027 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.617103 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.618547 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.618549 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.618681 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.627397 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.627527 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.628456 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.629401 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.629954 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.630094 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.630390 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.630534 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.630659 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.632977 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.633241 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-njb7r"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.633320 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.633530 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.634510 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngfnl"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.640010 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5mhff"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.640715 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5mhff"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.641457 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.641659 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njb7r"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.641877 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngfnl"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.643865 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.644027 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.644145 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.644183 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.644366 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.644397 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.644460 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.644757 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.644784 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.644823 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.644779 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.644938 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8nzkj"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.645023 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.645163 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.645674 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.645817 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.645908 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.646375 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.663897 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.664110 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.664465 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.664821 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.664921 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.665027 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.665567 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.665599 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.665851 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.666530 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5kmjp"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.666879 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.667021 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.667131 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mhj6z"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.667511 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-mhj6z"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.667798 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5kmjp"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.668375 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r6fd4"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.669119 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r6fd4"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.669316 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44z27"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.669685 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44z27"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.671765 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4jqfc"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.672123 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gjqh4"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.672491 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5t7"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.672876 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5t7"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.672934 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jqfc"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.673063 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gjqh4"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.676548 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.676617 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.677084 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.677178 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.677255 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ks8z9"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.677456 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.677637 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.677763 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.680199 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.680321 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.680665 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.681250 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-audit-dir\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.681279 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.681299 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ffdc69e-7658-4b27-949a-594688dbee92-trusted-ca\") pod \"ingress-operator-5b745b69d9-wvhc6\" (UID: \"9ffdc69e-7658-4b27-949a-594688dbee92\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.681316 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67s26\" (UniqueName: \"kubernetes.io/projected/fd61fed5-8370-4d59-8604-be1c73527f77-kube-api-access-67s26\") pod \"packageserver-d55dfcdfc-h94q9\" (UID: \"fd61fed5-8370-4d59-8604-be1c73527f77\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.681345 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1712254f-c8df-4c98-bfb2-bf79d98a6161-service-ca-bundle\") pod \"router-default-5444994796-r64z4\" (UID: \"1712254f-c8df-4c98-bfb2-bf79d98a6161\") " pod="openshift-ingress/router-default-5444994796-r64z4"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.681361 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddcd57da-ffda-491f-bfaf-c484979a8121-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ngfnl\" (UID: \"ddcd57da-ffda-491f-bfaf-c484979a8121\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngfnl"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.681375 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd2kf\" (UniqueName: \"kubernetes.io/projected/09a80831-32a7-4583-909d-96fa119c7aa1-kube-api-access-jd2kf\") pod \"etcd-operator-b45778765-8nzkj\" (UID: \"09a80831-32a7-4583-909d-96fa119c7aa1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.681388 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69d5f53c-375a-424e-a120-93d89d06ae50-client-ca\") pod \"route-controller-manager-6576b87f9c-8r7fw\" (UID: \"69d5f53c-375a-424e-a120-93d89d06ae50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.681414 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/10b365db-4478-4250-abe6-fa9e77354d70-node-pullsecrets\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.681430 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ce36f77-945f-473d-8fa5-011ee88d9adb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-42f4k\" (UID: \"1ce36f77-945f-473d-8fa5-011ee88d9adb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-42f4k"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.681448 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3422bfa7-f155-488a-a72c-129bed440646-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-26bbs\" (UID: \"3422bfa7-f155-488a-a72c-129bed440646\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-26bbs"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.681464 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3422bfa7-f155-488a-a72c-129bed440646-config\") pod \"openshift-apiserver-operator-796bbdcf4f-26bbs\" (UID: \"3422bfa7-f155-488a-a72c-129bed440646\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-26bbs"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.681479 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6xzs\" (UniqueName: \"kubernetes.io/projected/0409709f-7b11-4bcd-aec9-b8922c4474c9-kube-api-access-j6xzs\") pod \"machine-approver-56656f9798-f8wgl\" (UID: \"0409709f-7b11-4bcd-aec9-b8922c4474c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.681586 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-lwg5q"]
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.682199 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lwg5q"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.682830 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.682881 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5vsk\" (UniqueName: \"kubernetes.io/projected/3cdcf878-abd9-481a-9f95-adeda51c77c8-kube-api-access-s5vsk\") pod \"machine-config-controller-84d6567774-njb7r\" (UID: \"3cdcf878-abd9-481a-9f95-adeda51c77c8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njb7r"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.682900 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da19aa12-0eea-49f0-820a-b2d6da99484a-serving-cert\") pod \"authentication-operator-69f744f599-nlffm\" (UID: \"da19aa12-0eea-49f0-820a-b2d6da99484a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nlffm"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.682916 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09a80831-32a7-4583-909d-96fa119c7aa1-etcd-service-ca\") pod \"etcd-operator-b45778765-8nzkj\" (UID: \"09a80831-32a7-4583-909d-96fa119c7aa1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.682932 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r5m6\" (UniqueName: \"kubernetes.io/projected/a823a439-112f-4160-b925-b9b0830ec38f-kube-api-access-7r5m6\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvpk5\" (UID: \"a823a439-112f-4160-b925-b9b0830ec38f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvpk5"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.682949 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e511cbc-ca72-4380-8626-d9cade8ce3e2-config\") pod \"machine-api-operator-5694c8668f-tgck5\" (UID: \"1e511cbc-ca72-4380-8626-d9cade8ce3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tgck5"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.682966 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1ce36f77-945f-473d-8fa5-011ee88d9adb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-42f4k\" (UID: \"1ce36f77-945f-473d-8fa5-011ee88d9adb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-42f4k"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.682980 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddcd57da-ffda-491f-bfaf-c484979a8121-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ngfnl\" (UID: \"ddcd57da-ffda-491f-bfaf-c484979a8121\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngfnl"
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683034 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName:
\"kubernetes.io/configmap/bab40ec2-3024-45f4-9efb-f07829d8786b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ggnjr\" (UID: \"bab40ec2-3024-45f4-9efb-f07829d8786b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ggnjr" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683076 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ce6df133-1d7c-41a7-a9a8-a18676bae045-audit-dir\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683111 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-serving-cert\") pod \"controller-manager-879f6c89f-4twv2\" (UID: \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683132 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683148 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0409709f-7b11-4bcd-aec9-b8922c4474c9-config\") pod \"machine-approver-56656f9798-f8wgl\" (UID: \"0409709f-7b11-4bcd-aec9-b8922c4474c9\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683165 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09a80831-32a7-4583-909d-96fa119c7aa1-etcd-client\") pod \"etcd-operator-b45778765-8nzkj\" (UID: \"09a80831-32a7-4583-909d-96fa119c7aa1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683186 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8skzv\" (UniqueName: \"kubernetes.io/projected/2bd8b379-5b19-453f-af09-9025f6b127db-kube-api-access-8skzv\") pod \"dns-operator-744455d44c-dn2jf\" (UID: \"2bd8b379-5b19-453f-af09-9025f6b127db\") " pod="openshift-dns-operator/dns-operator-744455d44c-dn2jf" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683212 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk4p2\" (UniqueName: \"kubernetes.io/projected/da19aa12-0eea-49f0-820a-b2d6da99484a-kube-api-access-mk4p2\") pod \"authentication-operator-69f744f599-nlffm\" (UID: \"da19aa12-0eea-49f0-820a-b2d6da99484a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nlffm" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683230 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d7kc\" (UniqueName: \"kubernetes.io/projected/3422bfa7-f155-488a-a72c-129bed440646-kube-api-access-7d7kc\") pod \"openshift-apiserver-operator-796bbdcf4f-26bbs\" (UID: \"3422bfa7-f155-488a-a72c-129bed440646\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-26bbs" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683257 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ce6df133-1d7c-41a7-a9a8-a18676bae045-encryption-config\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683273 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09a80831-32a7-4583-909d-96fa119c7aa1-config\") pod \"etcd-operator-b45778765-8nzkj\" (UID: \"09a80831-32a7-4583-909d-96fa119c7aa1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683316 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ce36f77-945f-473d-8fa5-011ee88d9adb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-42f4k\" (UID: \"1ce36f77-945f-473d-8fa5-011ee88d9adb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-42f4k" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683368 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e511cbc-ca72-4380-8626-d9cade8ce3e2-images\") pod \"machine-api-operator-5694c8668f-tgck5\" (UID: \"1e511cbc-ca72-4380-8626-d9cade8ce3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tgck5" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683389 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/be2a1543-ec30-4cc1-923f-62123740ac1a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gj5kj\" (UID: 
\"be2a1543-ec30-4cc1-923f-62123740ac1a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gj5kj" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683404 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf7z9\" (UniqueName: \"kubernetes.io/projected/4459d391-5fe4-4159-b385-9c35fe4cfe77-kube-api-access-pf7z9\") pod \"cluster-samples-operator-665b6dd947-psd6r\" (UID: \"4459d391-5fe4-4159-b385-9c35fe4cfe77\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-psd6r" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683433 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wggb5\" (UniqueName: \"kubernetes.io/projected/1712254f-c8df-4c98-bfb2-bf79d98a6161-kube-api-access-wggb5\") pod \"router-default-5444994796-r64z4\" (UID: \"1712254f-c8df-4c98-bfb2-bf79d98a6161\") " pod="openshift-ingress/router-default-5444994796-r64z4" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683449 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26gj4\" (UniqueName: \"kubernetes.io/projected/ce6df133-1d7c-41a7-a9a8-a18676bae045-kube-api-access-26gj4\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683473 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10b365db-4478-4250-abe6-fa9e77354d70-serving-cert\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683489 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ce6df133-1d7c-41a7-a9a8-a18676bae045-audit-policies\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683522 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhz5w\" (UniqueName: \"kubernetes.io/projected/b84f6b8e-f98e-4d84-823f-f83d7912bc6b-kube-api-access-mhz5w\") pod \"downloads-7954f5f757-kf2t6\" (UID: \"b84f6b8e-f98e-4d84-823f-f83d7912bc6b\") " pod="openshift-console/downloads-7954f5f757-kf2t6" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683539 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683557 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ffdc69e-7658-4b27-949a-594688dbee92-metrics-tls\") pod \"ingress-operator-5b745b69d9-wvhc6\" (UID: \"9ffdc69e-7658-4b27-949a-594688dbee92\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683577 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9ffdc69e-7658-4b27-949a-594688dbee92-bound-sa-token\") pod \"ingress-operator-5b745b69d9-wvhc6\" (UID: \"9ffdc69e-7658-4b27-949a-594688dbee92\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683593 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd61fed5-8370-4d59-8604-be1c73527f77-webhook-cert\") pod \"packageserver-d55dfcdfc-h94q9\" (UID: \"fd61fed5-8370-4d59-8604-be1c73527f77\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683608 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1712254f-c8df-4c98-bfb2-bf79d98a6161-metrics-certs\") pod \"router-default-5444994796-r64z4\" (UID: \"1712254f-c8df-4c98-bfb2-bf79d98a6161\") " pod="openshift-ingress/router-default-5444994796-r64z4" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683622 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0409709f-7b11-4bcd-aec9-b8922c4474c9-auth-proxy-config\") pod \"machine-approver-56656f9798-f8wgl\" (UID: \"0409709f-7b11-4bcd-aec9-b8922c4474c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683650 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/003c1afa-e2c6-45c8-bdb3-1da462a231d3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5mhff\" (UID: \"003c1afa-e2c6-45c8-bdb3-1da462a231d3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5mhff" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683675 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/10b365db-4478-4250-abe6-fa9e77354d70-etcd-client\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683699 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/10b365db-4478-4250-abe6-fa9e77354d70-image-import-ca\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683720 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10b365db-4478-4250-abe6-fa9e77354d70-trusted-ca-bundle\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683749 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-audit-policies\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683773 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 
12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683789 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69d5f53c-375a-424e-a120-93d89d06ae50-serving-cert\") pod \"route-controller-manager-6576b87f9c-8r7fw\" (UID: \"69d5f53c-375a-424e-a120-93d89d06ae50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683827 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-client-ca\") pod \"controller-manager-879f6c89f-4twv2\" (UID: \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683842 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683864 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fd61fed5-8370-4d59-8604-be1c73527f77-tmpfs\") pod \"packageserver-d55dfcdfc-h94q9\" (UID: \"fd61fed5-8370-4d59-8604-be1c73527f77\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683896 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/ce6df133-1d7c-41a7-a9a8-a18676bae045-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683954 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69hmp\" (UniqueName: \"kubernetes.io/projected/9ffdc69e-7658-4b27-949a-594688dbee92-kube-api-access-69hmp\") pod \"ingress-operator-5b745b69d9-wvhc6\" (UID: \"9ffdc69e-7658-4b27-949a-594688dbee92\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683975 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1712254f-c8df-4c98-bfb2-bf79d98a6161-default-certificate\") pod \"router-default-5444994796-r64z4\" (UID: \"1712254f-c8df-4c98-bfb2-bf79d98a6161\") " pod="openshift-ingress/router-default-5444994796-r64z4" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.683991 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ce6df133-1d7c-41a7-a9a8-a18676bae045-etcd-client\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684007 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003c1afa-e2c6-45c8-bdb3-1da462a231d3-config\") pod \"kube-controller-manager-operator-78b949d7b-5mhff\" (UID: \"003c1afa-e2c6-45c8-bdb3-1da462a231d3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5mhff" Oct 12 07:37:16 crc 
kubenswrapper[4599]: I1012 07:37:16.684021 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09a80831-32a7-4583-909d-96fa119c7aa1-etcd-ca\") pod \"etcd-operator-b45778765-8nzkj\" (UID: \"09a80831-32a7-4583-909d-96fa119c7aa1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684053 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-config\") pod \"controller-manager-879f6c89f-4twv2\" (UID: \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684071 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/10b365db-4478-4250-abe6-fa9e77354d70-audit\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684086 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684110 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e511cbc-ca72-4380-8626-d9cade8ce3e2-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-tgck5\" (UID: \"1e511cbc-ca72-4380-8626-d9cade8ce3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tgck5" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684132 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n26sd\" (UniqueName: \"kubernetes.io/projected/ddcd57da-ffda-491f-bfaf-c484979a8121-kube-api-access-n26sd\") pod \"openshift-controller-manager-operator-756b6f6bc6-ngfnl\" (UID: \"ddcd57da-ffda-491f-bfaf-c484979a8121\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngfnl" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684164 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10b365db-4478-4250-abe6-fa9e77354d70-audit-dir\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684180 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ghlf\" (UniqueName: \"kubernetes.io/projected/1ce36f77-945f-473d-8fa5-011ee88d9adb-kube-api-access-9ghlf\") pod \"cluster-image-registry-operator-dc59b4c8b-42f4k\" (UID: \"1ce36f77-945f-473d-8fa5-011ee88d9adb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-42f4k" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684195 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da19aa12-0eea-49f0-820a-b2d6da99484a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-nlffm\" (UID: \"da19aa12-0eea-49f0-820a-b2d6da99484a\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-nlffm" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684217 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j2ww\" (UniqueName: \"kubernetes.io/projected/69d5f53c-375a-424e-a120-93d89d06ae50-kube-api-access-7j2ww\") pod \"route-controller-manager-6576b87f9c-8r7fw\" (UID: \"69d5f53c-375a-424e-a120-93d89d06ae50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684233 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684252 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684266 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bd8b379-5b19-453f-af09-9025f6b127db-metrics-tls\") pod \"dns-operator-744455d44c-dn2jf\" (UID: \"2bd8b379-5b19-453f-af09-9025f6b127db\") " pod="openshift-dns-operator/dns-operator-744455d44c-dn2jf" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684292 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da19aa12-0eea-49f0-820a-b2d6da99484a-service-ca-bundle\") pod \"authentication-operator-69f744f599-nlffm\" (UID: \"da19aa12-0eea-49f0-820a-b2d6da99484a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nlffm" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684312 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684327 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da19aa12-0eea-49f0-820a-b2d6da99484a-config\") pod \"authentication-operator-69f744f599-nlffm\" (UID: \"da19aa12-0eea-49f0-820a-b2d6da99484a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nlffm" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684383 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4257\" (UniqueName: \"kubernetes.io/projected/10b365db-4478-4250-abe6-fa9e77354d70-kube-api-access-x4257\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684401 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd61fed5-8370-4d59-8604-be1c73527f77-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-h94q9\" (UID: \"fd61fed5-8370-4d59-8604-be1c73527f77\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684416 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl8hm\" (UniqueName: \"kubernetes.io/projected/1e511cbc-ca72-4380-8626-d9cade8ce3e2-kube-api-access-tl8hm\") pod \"machine-api-operator-5694c8668f-tgck5\" (UID: \"1e511cbc-ca72-4380-8626-d9cade8ce3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tgck5" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684437 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bab40ec2-3024-45f4-9efb-f07829d8786b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ggnjr\" (UID: \"bab40ec2-3024-45f4-9efb-f07829d8786b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ggnjr" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684466 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bab40ec2-3024-45f4-9efb-f07829d8786b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ggnjr\" (UID: \"bab40ec2-3024-45f4-9efb-f07829d8786b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ggnjr" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684483 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/003c1afa-e2c6-45c8-bdb3-1da462a231d3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5mhff\" (UID: \"003c1afa-e2c6-45c8-bdb3-1da462a231d3\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5mhff" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684499 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09a80831-32a7-4583-909d-96fa119c7aa1-serving-cert\") pod \"etcd-operator-b45778765-8nzkj\" (UID: \"09a80831-32a7-4583-909d-96fa119c7aa1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684528 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684545 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1712254f-c8df-4c98-bfb2-bf79d98a6161-stats-auth\") pod \"router-default-5444994796-r64z4\" (UID: \"1712254f-c8df-4c98-bfb2-bf79d98a6161\") " pod="openshift-ingress/router-default-5444994796-r64z4" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684560 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce6df133-1d7c-41a7-a9a8-a18676bae045-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684581 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/be2a1543-ec30-4cc1-923f-62123740ac1a-serving-cert\") pod \"openshift-config-operator-7777fb866f-gj5kj\" (UID: \"be2a1543-ec30-4cc1-923f-62123740ac1a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gj5kj" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684597 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69d5f53c-375a-424e-a120-93d89d06ae50-config\") pod \"route-controller-manager-6576b87f9c-8r7fw\" (UID: \"69d5f53c-375a-424e-a120-93d89d06ae50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684643 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3cdcf878-abd9-481a-9f95-adeda51c77c8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-njb7r\" (UID: \"3cdcf878-abd9-481a-9f95-adeda51c77c8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njb7r" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684660 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdzsp\" (UniqueName: \"kubernetes.io/projected/be2a1543-ec30-4cc1-923f-62123740ac1a-kube-api-access-xdzsp\") pod \"openshift-config-operator-7777fb866f-gj5kj\" (UID: \"be2a1543-ec30-4cc1-923f-62123740ac1a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gj5kj" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684699 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4twv2\" (UID: 
\"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684716 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f6vn\" (UniqueName: \"kubernetes.io/projected/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-kube-api-access-8f6vn\") pod \"controller-manager-879f6c89f-4twv2\" (UID: \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684740 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/10b365db-4478-4250-abe6-fa9e77354d70-etcd-serving-ca\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684758 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3cdcf878-abd9-481a-9f95-adeda51c77c8-proxy-tls\") pod \"machine-config-controller-84d6567774-njb7r\" (UID: \"3cdcf878-abd9-481a-9f95-adeda51c77c8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njb7r" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684773 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a823a439-112f-4160-b925-b9b0830ec38f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvpk5\" (UID: \"a823a439-112f-4160-b925-b9b0830ec38f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvpk5" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684807 4599 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/10b365db-4478-4250-abe6-fa9e77354d70-encryption-config\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684823 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a823a439-112f-4160-b925-b9b0830ec38f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvpk5\" (UID: \"a823a439-112f-4160-b925-b9b0830ec38f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvpk5" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684854 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10b365db-4478-4250-abe6-fa9e77354d70-config\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684871 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zws9k\" (UniqueName: \"kubernetes.io/projected/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-kube-api-access-zws9k\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684886 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce6df133-1d7c-41a7-a9a8-a18676bae045-serving-cert\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: 
\"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684903 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0409709f-7b11-4bcd-aec9-b8922c4474c9-machine-approver-tls\") pod \"machine-approver-56656f9798-f8wgl\" (UID: \"0409709f-7b11-4bcd-aec9-b8922c4474c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.684917 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4459d391-5fe4-4159-b385-9c35fe4cfe77-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-psd6r\" (UID: \"4459d391-5fe4-4159-b385-9c35fe4cfe77\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-psd6r" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.685804 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4b6ss"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.687142 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4b6ss" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.687262 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gj5kj"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.695045 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.695992 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.697622 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xq6bd"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.698660 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xq6bd" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.702265 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.702785 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fwlgl"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.703534 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.708499 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-26bbs"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.708778 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-nlffm"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.709552 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-h9rz4"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.711255 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tgck5"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.711348 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-h9rz4" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.711954 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-v2b5m"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.712922 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ggnjr"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.713801 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-psd6r"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.714994 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.715106 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4twv2"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.715431 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-dn2jf"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.716212 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngfnl"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.716998 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5t7"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.717944 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5kmjp"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.718808 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-kf2t6"] Oct 12 07:37:16 crc 
kubenswrapper[4599]: I1012 07:37:16.719629 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44z27"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.720394 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8nzkj"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.721159 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mhj6z"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.721971 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-njb7r"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.722961 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.723677 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.724477 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.725428 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-42f4k"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.726196 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4jqfc"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.726991 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5mhff"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.727808 4599 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.728635 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvpk5"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.729493 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gjqh4"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.730433 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4b6ss"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.731212 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r6fd4"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.732413 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.732858 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.733624 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ks8z9"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.734474 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jlnn9"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.734936 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.735235 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns/dns-default-h9rz4"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.736924 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lwg5q"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.737737 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.738772 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fwlgl"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.739387 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-k258m"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.740019 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-k258m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.740196 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xq6bd"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.755794 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.775313 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787144 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da19aa12-0eea-49f0-820a-b2d6da99484a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-nlffm\" (UID: \"da19aa12-0eea-49f0-820a-b2d6da99484a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nlffm" Oct 12 07:37:16 crc 
kubenswrapper[4599]: I1012 07:37:16.787182 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/503dce4f-a5a4-4a20-8546-a0311a212b84-config\") pod \"service-ca-operator-777779d784-4jqfc\" (UID: \"503dce4f-a5a4-4a20-8546-a0311a212b84\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jqfc" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787206 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10b365db-4478-4250-abe6-fa9e77354d70-audit-dir\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787233 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da19aa12-0eea-49f0-820a-b2d6da99484a-service-ca-bundle\") pod \"authentication-operator-69f744f599-nlffm\" (UID: \"da19aa12-0eea-49f0-820a-b2d6da99484a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nlffm" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787252 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787268 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787282 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bd8b379-5b19-453f-af09-9025f6b127db-metrics-tls\") pod \"dns-operator-744455d44c-dn2jf\" (UID: \"2bd8b379-5b19-453f-af09-9025f6b127db\") " pod="openshift-dns-operator/dns-operator-744455d44c-dn2jf" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787296 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da19aa12-0eea-49f0-820a-b2d6da99484a-config\") pod \"authentication-operator-69f744f599-nlffm\" (UID: \"da19aa12-0eea-49f0-820a-b2d6da99484a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nlffm" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787313 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl8hm\" (UniqueName: \"kubernetes.io/projected/1e511cbc-ca72-4380-8626-d9cade8ce3e2-kube-api-access-tl8hm\") pod \"machine-api-operator-5694c8668f-tgck5\" (UID: \"1e511cbc-ca72-4380-8626-d9cade8ce3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tgck5" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787347 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09a80831-32a7-4583-909d-96fa119c7aa1-serving-cert\") pod \"etcd-operator-b45778765-8nzkj\" (UID: \"09a80831-32a7-4583-909d-96fa119c7aa1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787368 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bzvg\" (UniqueName: 
\"kubernetes.io/projected/c2d2a6b0-e99f-4c3b-8535-77d97c89d3ed-kube-api-access-5bzvg\") pod \"multus-admission-controller-857f4d67dd-gjqh4\" (UID: \"c2d2a6b0-e99f-4c3b-8535-77d97c89d3ed\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gjqh4" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787403 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bab40ec2-3024-45f4-9efb-f07829d8786b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ggnjr\" (UID: \"bab40ec2-3024-45f4-9efb-f07829d8786b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ggnjr" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787419 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce6df133-1d7c-41a7-a9a8-a18676bae045-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787437 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-trusted-ca-bundle\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787458 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc 
kubenswrapper[4599]: I1012 07:37:16.787474 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1712254f-c8df-4c98-bfb2-bf79d98a6161-stats-auth\") pod \"router-default-5444994796-r64z4\" (UID: \"1712254f-c8df-4c98-bfb2-bf79d98a6161\") " pod="openshift-ingress/router-default-5444994796-r64z4" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787497 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3cdcf878-abd9-481a-9f95-adeda51c77c8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-njb7r\" (UID: \"3cdcf878-abd9-481a-9f95-adeda51c77c8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njb7r" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787521 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4twv2\" (UID: \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787537 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f6vn\" (UniqueName: \"kubernetes.io/projected/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-kube-api-access-8f6vn\") pod \"controller-manager-879f6c89f-4twv2\" (UID: \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787553 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/10b365db-4478-4250-abe6-fa9e77354d70-etcd-serving-ca\") pod \"apiserver-76f77b778f-v2b5m\" (UID: 
\"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787570 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3cdcf878-abd9-481a-9f95-adeda51c77c8-proxy-tls\") pod \"machine-config-controller-84d6567774-njb7r\" (UID: \"3cdcf878-abd9-481a-9f95-adeda51c77c8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njb7r" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787588 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a823a439-112f-4160-b925-b9b0830ec38f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvpk5\" (UID: \"a823a439-112f-4160-b925-b9b0830ec38f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvpk5" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787613 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ee778128-189f-4177-9d3d-bbc1468da627-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9m75z\" (UID: \"ee778128-189f-4177-9d3d-bbc1468da627\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787630 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a823a439-112f-4160-b925-b9b0830ec38f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvpk5\" (UID: \"a823a439-112f-4160-b925-b9b0830ec38f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvpk5" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787648 4599 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce6df133-1d7c-41a7-a9a8-a18676bae045-serving-cert\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787666 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4459d391-5fe4-4159-b385-9c35fe4cfe77-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-psd6r\" (UID: \"4459d391-5fe4-4159-b385-9c35fe4cfe77\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-psd6r" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787681 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-oauth-config\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787698 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787714 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ffdc69e-7658-4b27-949a-594688dbee92-trusted-ca\") pod \"ingress-operator-5b745b69d9-wvhc6\" (UID: \"9ffdc69e-7658-4b27-949a-594688dbee92\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787740 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1712254f-c8df-4c98-bfb2-bf79d98a6161-service-ca-bundle\") pod \"router-default-5444994796-r64z4\" (UID: \"1712254f-c8df-4c98-bfb2-bf79d98a6161\") " pod="openshift-ingress/router-default-5444994796-r64z4" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787764 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-audit-dir\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787778 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/10b365db-4478-4250-abe6-fa9e77354d70-node-pullsecrets\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787796 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72x27\" (UniqueName: \"kubernetes.io/projected/b22d491a-9add-4ec5-ad3e-e593f9ca93bd-kube-api-access-72x27\") pod \"control-plane-machine-set-operator-78cbb6b69f-44z27\" (UID: \"b22d491a-9add-4ec5-ad3e-e593f9ca93bd\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44z27" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787812 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5-certs\") pod \"machine-config-server-k258m\" (UID: \"b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5\") " pod="openshift-machine-config-operator/machine-config-server-k258m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787829 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3422bfa7-f155-488a-a72c-129bed440646-config\") pod \"openshift-apiserver-operator-796bbdcf4f-26bbs\" (UID: \"3422bfa7-f155-488a-a72c-129bed440646\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-26bbs" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787844 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da19aa12-0eea-49f0-820a-b2d6da99484a-serving-cert\") pod \"authentication-operator-69f744f599-nlffm\" (UID: \"da19aa12-0eea-49f0-820a-b2d6da99484a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nlffm" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787860 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787877 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r5m6\" (UniqueName: \"kubernetes.io/projected/a823a439-112f-4160-b925-b9b0830ec38f-kube-api-access-7r5m6\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvpk5\" (UID: \"a823a439-112f-4160-b925-b9b0830ec38f\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvpk5" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787896 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e511cbc-ca72-4380-8626-d9cade8ce3e2-config\") pod \"machine-api-operator-5694c8668f-tgck5\" (UID: \"1e511cbc-ca72-4380-8626-d9cade8ce3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tgck5" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.787911 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1ce36f77-945f-473d-8fa5-011ee88d9adb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-42f4k\" (UID: \"1ce36f77-945f-473d-8fa5-011ee88d9adb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-42f4k" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.789664 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddcd57da-ffda-491f-bfaf-c484979a8121-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ngfnl\" (UID: \"ddcd57da-ffda-491f-bfaf-c484979a8121\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngfnl" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.789703 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bab40ec2-3024-45f4-9efb-f07829d8786b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ggnjr\" (UID: \"bab40ec2-3024-45f4-9efb-f07829d8786b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ggnjr" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.789752 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ce6df133-1d7c-41a7-a9a8-a18676bae045-audit-dir\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.789788 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee778128-189f-4177-9d3d-bbc1468da627-proxy-tls\") pod \"machine-config-operator-74547568cd-9m75z\" (UID: \"ee778128-189f-4177-9d3d-bbc1468da627\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.789837 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5-node-bootstrap-token\") pod \"machine-config-server-k258m\" (UID: \"b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5\") " pod="openshift-machine-config-operator/machine-config-server-k258m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.789881 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk4p2\" (UniqueName: \"kubernetes.io/projected/da19aa12-0eea-49f0-820a-b2d6da99484a-kube-api-access-mk4p2\") pod \"authentication-operator-69f744f599-nlffm\" (UID: \"da19aa12-0eea-49f0-820a-b2d6da99484a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nlffm" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.789903 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ce6df133-1d7c-41a7-a9a8-a18676bae045-encryption-config\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 
crc kubenswrapper[4599]: I1012 07:37:16.789928 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e511cbc-ca72-4380-8626-d9cade8ce3e2-images\") pod \"machine-api-operator-5694c8668f-tgck5\" (UID: \"1e511cbc-ca72-4380-8626-d9cade8ce3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tgck5" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.789953 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/be2a1543-ec30-4cc1-923f-62123740ac1a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gj5kj\" (UID: \"be2a1543-ec30-4cc1-923f-62123740ac1a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gj5kj" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.789979 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-service-ca\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790008 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ee778128-189f-4177-9d3d-bbc1468da627-images\") pod \"machine-config-operator-74547568cd-9m75z\" (UID: \"ee778128-189f-4177-9d3d-bbc1468da627\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790048 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10b365db-4478-4250-abe6-fa9e77354d70-serving-cert\") pod \"apiserver-76f77b778f-v2b5m\" (UID: 
\"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790074 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/003c1afa-e2c6-45c8-bdb3-1da462a231d3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5mhff\" (UID: \"003c1afa-e2c6-45c8-bdb3-1da462a231d3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5mhff" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790099 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9ffdc69e-7658-4b27-949a-594688dbee92-bound-sa-token\") pod \"ingress-operator-5b745b69d9-wvhc6\" (UID: \"9ffdc69e-7658-4b27-949a-594688dbee92\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790120 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0409709f-7b11-4bcd-aec9-b8922c4474c9-auth-proxy-config\") pod \"machine-approver-56656f9798-f8wgl\" (UID: \"0409709f-7b11-4bcd-aec9-b8922c4474c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790139 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/10b365db-4478-4250-abe6-fa9e77354d70-etcd-client\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790162 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/10b365db-4478-4250-abe6-fa9e77354d70-image-import-ca\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790189 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790218 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a560875d-c07e-457b-a77b-809cc770867c-secret-volume\") pod \"collect-profiles-29337570-l9r8r\" (UID: \"a560875d-c07e-457b-a77b-809cc770867c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790238 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-client-ca\") pod \"controller-manager-879f6c89f-4twv2\" (UID: \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790281 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ce6df133-1d7c-41a7-a9a8-a18676bae045-etcd-client\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790305 4599 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003c1afa-e2c6-45c8-bdb3-1da462a231d3-config\") pod \"kube-controller-manager-operator-78b949d7b-5mhff\" (UID: \"003c1afa-e2c6-45c8-bdb3-1da462a231d3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5mhff" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790323 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09a80831-32a7-4583-909d-96fa119c7aa1-etcd-ca\") pod \"etcd-operator-b45778765-8nzkj\" (UID: \"09a80831-32a7-4583-909d-96fa119c7aa1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790364 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-config\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790392 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69hmp\" (UniqueName: \"kubernetes.io/projected/9ffdc69e-7658-4b27-949a-594688dbee92-kube-api-access-69hmp\") pod \"ingress-operator-5b745b69d9-wvhc6\" (UID: \"9ffdc69e-7658-4b27-949a-594688dbee92\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790415 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1712254f-c8df-4c98-bfb2-bf79d98a6161-default-certificate\") pod \"router-default-5444994796-r64z4\" (UID: \"1712254f-c8df-4c98-bfb2-bf79d98a6161\") " 
pod="openshift-ingress/router-default-5444994796-r64z4" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790434 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790458 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-config\") pod \"controller-manager-879f6c89f-4twv2\" (UID: \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790480 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/10b365db-4478-4250-abe6-fa9e77354d70-audit\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790504 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd5gw\" (UniqueName: \"kubernetes.io/projected/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-kube-api-access-kd5gw\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.790982 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10b365db-4478-4250-abe6-fa9e77354d70-audit-dir\") pod \"apiserver-76f77b778f-v2b5m\" (UID: 
\"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791035 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e511cbc-ca72-4380-8626-d9cade8ce3e2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tgck5\" (UID: \"1e511cbc-ca72-4380-8626-d9cade8ce3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tgck5" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791110 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n26sd\" (UniqueName: \"kubernetes.io/projected/ddcd57da-ffda-491f-bfaf-c484979a8121-kube-api-access-n26sd\") pod \"openshift-controller-manager-operator-756b6f6bc6-ngfnl\" (UID: \"ddcd57da-ffda-491f-bfaf-c484979a8121\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngfnl" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791178 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-serving-cert\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791215 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ghlf\" (UniqueName: \"kubernetes.io/projected/1ce36f77-945f-473d-8fa5-011ee88d9adb-kube-api-access-9ghlf\") pod \"cluster-image-registry-operator-dc59b4c8b-42f4k\" (UID: \"1ce36f77-945f-473d-8fa5-011ee88d9adb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-42f4k" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791264 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7j2ww\" (UniqueName: \"kubernetes.io/projected/69d5f53c-375a-424e-a120-93d89d06ae50-kube-api-access-7j2ww\") pod \"route-controller-manager-6576b87f9c-8r7fw\" (UID: \"69d5f53c-375a-424e-a120-93d89d06ae50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791293 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791359 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bab40ec2-3024-45f4-9efb-f07829d8786b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ggnjr\" (UID: \"bab40ec2-3024-45f4-9efb-f07829d8786b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ggnjr" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791391 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4257\" (UniqueName: \"kubernetes.io/projected/10b365db-4478-4250-abe6-fa9e77354d70-kube-api-access-x4257\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791441 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd61fed5-8370-4d59-8604-be1c73527f77-apiservice-cert\") pod \"packageserver-d55dfcdfc-h94q9\" (UID: 
\"fd61fed5-8370-4d59-8604-be1c73527f77\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791470 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/003c1afa-e2c6-45c8-bdb3-1da462a231d3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5mhff\" (UID: \"003c1afa-e2c6-45c8-bdb3-1da462a231d3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5mhff" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791507 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be2a1543-ec30-4cc1-923f-62123740ac1a-serving-cert\") pod \"openshift-config-operator-7777fb866f-gj5kj\" (UID: \"be2a1543-ec30-4cc1-923f-62123740ac1a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gj5kj" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791541 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69d5f53c-375a-424e-a120-93d89d06ae50-config\") pod \"route-controller-manager-6576b87f9c-8r7fw\" (UID: \"69d5f53c-375a-424e-a120-93d89d06ae50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791587 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdzsp\" (UniqueName: \"kubernetes.io/projected/be2a1543-ec30-4cc1-923f-62123740ac1a-kube-api-access-xdzsp\") pod \"openshift-config-operator-7777fb866f-gj5kj\" (UID: \"be2a1543-ec30-4cc1-923f-62123740ac1a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gj5kj" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791618 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/11c1c8aa-1469-4229-9163-9df08ae4192f-srv-cert\") pod \"catalog-operator-68c6474976-wqwjn\" (UID: \"11c1c8aa-1469-4229-9163-9df08ae4192f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791642 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/10b365db-4478-4250-abe6-fa9e77354d70-encryption-config\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791684 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10b365db-4478-4250-abe6-fa9e77354d70-config\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791713 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/63a0b5d9-8401-4c7d-b897-295c91ffc508-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-r6fd4\" (UID: \"63a0b5d9-8401-4c7d-b897-295c91ffc508\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r6fd4" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791783 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zws9k\" (UniqueName: \"kubernetes.io/projected/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-kube-api-access-zws9k\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791808 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0409709f-7b11-4bcd-aec9-b8922c4474c9-machine-approver-tls\") pod \"machine-approver-56656f9798-f8wgl\" (UID: \"0409709f-7b11-4bcd-aec9-b8922c4474c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791857 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd2kf\" (UniqueName: \"kubernetes.io/projected/09a80831-32a7-4583-909d-96fa119c7aa1-kube-api-access-jd2kf\") pod \"etcd-operator-b45778765-8nzkj\" (UID: \"09a80831-32a7-4583-909d-96fa119c7aa1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791883 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69d5f53c-375a-424e-a120-93d89d06ae50-client-ca\") pod \"route-controller-manager-6576b87f9c-8r7fw\" (UID: \"69d5f53c-375a-424e-a120-93d89d06ae50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791930 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a560875d-c07e-457b-a77b-809cc770867c-config-volume\") pod \"collect-profiles-29337570-l9r8r\" (UID: \"a560875d-c07e-457b-a77b-809cc770867c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.791962 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/b22d491a-9add-4ec5-ad3e-e593f9ca93bd-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-44z27\" (UID: \"b22d491a-9add-4ec5-ad3e-e593f9ca93bd\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44z27" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792008 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c2d2a6b0-e99f-4c3b-8535-77d97c89d3ed-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gjqh4\" (UID: \"c2d2a6b0-e99f-4c3b-8535-77d97c89d3ed\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gjqh4" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792038 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67s26\" (UniqueName: \"kubernetes.io/projected/fd61fed5-8370-4d59-8604-be1c73527f77-kube-api-access-67s26\") pod \"packageserver-d55dfcdfc-h94q9\" (UID: \"fd61fed5-8370-4d59-8604-be1c73527f77\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792083 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddcd57da-ffda-491f-bfaf-c484979a8121-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ngfnl\" (UID: \"ddcd57da-ffda-491f-bfaf-c484979a8121\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngfnl" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792110 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ce36f77-945f-473d-8fa5-011ee88d9adb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-42f4k\" (UID: 
\"1ce36f77-945f-473d-8fa5-011ee88d9adb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-42f4k" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792138 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792175 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3422bfa7-f155-488a-a72c-129bed440646-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-26bbs\" (UID: \"3422bfa7-f155-488a-a72c-129bed440646\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-26bbs" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792205 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6xzs\" (UniqueName: \"kubernetes.io/projected/0409709f-7b11-4bcd-aec9-b8922c4474c9-kube-api-access-j6xzs\") pod \"machine-approver-56656f9798-f8wgl\" (UID: \"0409709f-7b11-4bcd-aec9-b8922c4474c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792258 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5vsk\" (UniqueName: \"kubernetes.io/projected/3cdcf878-abd9-481a-9f95-adeda51c77c8-kube-api-access-s5vsk\") pod \"machine-config-controller-84d6567774-njb7r\" (UID: \"3cdcf878-abd9-481a-9f95-adeda51c77c8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njb7r" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792278 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09a80831-32a7-4583-909d-96fa119c7aa1-etcd-service-ca\") pod \"etcd-operator-b45778765-8nzkj\" (UID: \"09a80831-32a7-4583-909d-96fa119c7aa1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792320 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-serving-cert\") pod \"controller-manager-879f6c89f-4twv2\" (UID: \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792376 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792402 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0409709f-7b11-4bcd-aec9-b8922c4474c9-config\") pod \"machine-approver-56656f9798-f8wgl\" (UID: \"0409709f-7b11-4bcd-aec9-b8922c4474c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792438 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09a80831-32a7-4583-909d-96fa119c7aa1-etcd-client\") pod \"etcd-operator-b45778765-8nzkj\" (UID: \"09a80831-32a7-4583-909d-96fa119c7aa1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj" Oct 12 07:37:16 crc 
kubenswrapper[4599]: I1012 07:37:16.792468 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8skzv\" (UniqueName: \"kubernetes.io/projected/2bd8b379-5b19-453f-af09-9025f6b127db-kube-api-access-8skzv\") pod \"dns-operator-744455d44c-dn2jf\" (UID: \"2bd8b379-5b19-453f-af09-9025f6b127db\") " pod="openshift-dns-operator/dns-operator-744455d44c-dn2jf" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792493 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw7h7\" (UniqueName: \"kubernetes.io/projected/ee778128-189f-4177-9d3d-bbc1468da627-kube-api-access-cw7h7\") pod \"machine-config-operator-74547568cd-9m75z\" (UID: \"ee778128-189f-4177-9d3d-bbc1468da627\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792540 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09a80831-32a7-4583-909d-96fa119c7aa1-config\") pod \"etcd-operator-b45778765-8nzkj\" (UID: \"09a80831-32a7-4583-909d-96fa119c7aa1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792566 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzz22\" (UniqueName: \"kubernetes.io/projected/a560875d-c07e-457b-a77b-809cc770867c-kube-api-access-pzz22\") pod \"collect-profiles-29337570-l9r8r\" (UID: \"a560875d-c07e-457b-a77b-809cc770867c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792603 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfwgl\" (UniqueName: 
\"kubernetes.io/projected/b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5-kube-api-access-bfwgl\") pod \"machine-config-server-k258m\" (UID: \"b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5\") " pod="openshift-machine-config-operator/machine-config-server-k258m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792628 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d7kc\" (UniqueName: \"kubernetes.io/projected/3422bfa7-f155-488a-a72c-129bed440646-kube-api-access-7d7kc\") pod \"openshift-apiserver-operator-796bbdcf4f-26bbs\" (UID: \"3422bfa7-f155-488a-a72c-129bed440646\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-26bbs" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792651 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ce36f77-945f-473d-8fa5-011ee88d9adb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-42f4k\" (UID: \"1ce36f77-945f-473d-8fa5-011ee88d9adb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-42f4k" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792692 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/503dce4f-a5a4-4a20-8546-a0311a212b84-serving-cert\") pod \"service-ca-operator-777779d784-4jqfc\" (UID: \"503dce4f-a5a4-4a20-8546-a0311a212b84\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jqfc" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792713 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/11c1c8aa-1469-4229-9163-9df08ae4192f-profile-collector-cert\") pod \"catalog-operator-68c6474976-wqwjn\" (UID: \"11c1c8aa-1469-4229-9163-9df08ae4192f\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792765 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf7z9\" (UniqueName: \"kubernetes.io/projected/4459d391-5fe4-4159-b385-9c35fe4cfe77-kube-api-access-pf7z9\") pod \"cluster-samples-operator-665b6dd947-psd6r\" (UID: \"4459d391-5fe4-4159-b385-9c35fe4cfe77\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-psd6r" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792794 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wggb5\" (UniqueName: \"kubernetes.io/projected/1712254f-c8df-4c98-bfb2-bf79d98a6161-kube-api-access-wggb5\") pod \"router-default-5444994796-r64z4\" (UID: \"1712254f-c8df-4c98-bfb2-bf79d98a6161\") " pod="openshift-ingress/router-default-5444994796-r64z4" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792834 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26gj4\" (UniqueName: \"kubernetes.io/projected/ce6df133-1d7c-41a7-a9a8-a18676bae045-kube-api-access-26gj4\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792860 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ce6df133-1d7c-41a7-a9a8-a18676bae045-audit-policies\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792880 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792924 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ffdc69e-7658-4b27-949a-594688dbee92-metrics-tls\") pod \"ingress-operator-5b745b69d9-wvhc6\" (UID: \"9ffdc69e-7658-4b27-949a-594688dbee92\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792949 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd61fed5-8370-4d59-8604-be1c73527f77-webhook-cert\") pod \"packageserver-d55dfcdfc-h94q9\" (UID: \"fd61fed5-8370-4d59-8604-be1c73527f77\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.792991 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1712254f-c8df-4c98-bfb2-bf79d98a6161-metrics-certs\") pod \"router-default-5444994796-r64z4\" (UID: \"1712254f-c8df-4c98-bfb2-bf79d98a6161\") " pod="openshift-ingress/router-default-5444994796-r64z4" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.793015 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhz5w\" (UniqueName: \"kubernetes.io/projected/b84f6b8e-f98e-4d84-823f-f83d7912bc6b-kube-api-access-mhz5w\") pod \"downloads-7954f5f757-kf2t6\" (UID: \"b84f6b8e-f98e-4d84-823f-f83d7912bc6b\") " pod="openshift-console/downloads-7954f5f757-kf2t6" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.793038 4599 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69d5f53c-375a-424e-a120-93d89d06ae50-serving-cert\") pod \"route-controller-manager-6576b87f9c-8r7fw\" (UID: \"69d5f53c-375a-424e-a120-93d89d06ae50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.793092 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-oauth-serving-cert\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.793122 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10b365db-4478-4250-abe6-fa9e77354d70-trusted-ca-bundle\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.793186 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-audit-policies\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.793207 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc 
kubenswrapper[4599]: I1012 07:37:16.793268 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fd61fed5-8370-4d59-8604-be1c73527f77-tmpfs\") pod \"packageserver-d55dfcdfc-h94q9\" (UID: \"fd61fed5-8370-4d59-8604-be1c73527f77\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.793292 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ce6df133-1d7c-41a7-a9a8-a18676bae045-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.793365 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbqms\" (UniqueName: \"kubernetes.io/projected/11c1c8aa-1469-4229-9163-9df08ae4192f-kube-api-access-gbqms\") pod \"catalog-operator-68c6474976-wqwjn\" (UID: \"11c1c8aa-1469-4229-9163-9df08ae4192f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.793387 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5f74\" (UniqueName: \"kubernetes.io/projected/63a0b5d9-8401-4c7d-b897-295c91ffc508-kube-api-access-d5f74\") pod \"package-server-manager-789f6589d5-r6fd4\" (UID: \"63a0b5d9-8401-4c7d-b897-295c91ffc508\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r6fd4" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.793412 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6lxw\" (UniqueName: 
\"kubernetes.io/projected/0c13d504-e266-472f-9ca7-6644f8b37eef-kube-api-access-w6lxw\") pod \"migrator-59844c95c7-qw5t7\" (UID: \"0c13d504-e266-472f-9ca7-6644f8b37eef\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5t7" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.793459 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjb7q\" (UniqueName: \"kubernetes.io/projected/503dce4f-a5a4-4a20-8546-a0311a212b84-kube-api-access-vjb7q\") pod \"service-ca-operator-777779d784-4jqfc\" (UID: \"503dce4f-a5a4-4a20-8546-a0311a212b84\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jqfc" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.793905 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da19aa12-0eea-49f0-820a-b2d6da99484a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-nlffm\" (UID: \"da19aa12-0eea-49f0-820a-b2d6da99484a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nlffm" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.794283 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da19aa12-0eea-49f0-820a-b2d6da99484a-service-ca-bundle\") pod \"authentication-operator-69f744f599-nlffm\" (UID: \"da19aa12-0eea-49f0-820a-b2d6da99484a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nlffm" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.794917 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4twv2\" (UID: \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 
07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.796708 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bd8b379-5b19-453f-af09-9025f6b127db-metrics-tls\") pod \"dns-operator-744455d44c-dn2jf\" (UID: \"2bd8b379-5b19-453f-af09-9025f6b127db\") " pod="openshift-dns-operator/dns-operator-744455d44c-dn2jf" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.797014 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1712254f-c8df-4c98-bfb2-bf79d98a6161-service-ca-bundle\") pod \"router-default-5444994796-r64z4\" (UID: \"1712254f-c8df-4c98-bfb2-bf79d98a6161\") " pod="openshift-ingress/router-default-5444994796-r64z4" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.797205 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da19aa12-0eea-49f0-820a-b2d6da99484a-config\") pod \"authentication-operator-69f744f599-nlffm\" (UID: \"da19aa12-0eea-49f0-820a-b2d6da99484a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nlffm" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.797478 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.798039 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3cdcf878-abd9-481a-9f95-adeda51c77c8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-njb7r\" (UID: \"3cdcf878-abd9-481a-9f95-adeda51c77c8\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njb7r" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.798356 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/10b365db-4478-4250-abe6-fa9e77354d70-etcd-serving-ca\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.799407 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e511cbc-ca72-4380-8626-d9cade8ce3e2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tgck5\" (UID: \"1e511cbc-ca72-4380-8626-d9cade8ce3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tgck5" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.799392 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-audit-dir\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.799451 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/10b365db-4478-4250-abe6-fa9e77354d70-node-pullsecrets\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.799796 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e511cbc-ca72-4380-8626-d9cade8ce3e2-config\") pod \"machine-api-operator-5694c8668f-tgck5\" (UID: 
\"1e511cbc-ca72-4380-8626-d9cade8ce3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tgck5" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.800016 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bab40ec2-3024-45f4-9efb-f07829d8786b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ggnjr\" (UID: \"bab40ec2-3024-45f4-9efb-f07829d8786b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ggnjr" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.800082 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ce6df133-1d7c-41a7-a9a8-a18676bae045-audit-dir\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.800637 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.800939 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.801394 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1712254f-c8df-4c98-bfb2-bf79d98a6161-stats-auth\") pod \"router-default-5444994796-r64z4\" (UID: \"1712254f-c8df-4c98-bfb2-bf79d98a6161\") " pod="openshift-ingress/router-default-5444994796-r64z4" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.801691 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bab40ec2-3024-45f4-9efb-f07829d8786b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ggnjr\" (UID: \"bab40ec2-3024-45f4-9efb-f07829d8786b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ggnjr" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.802103 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e511cbc-ca72-4380-8626-d9cade8ce3e2-images\") pod \"machine-api-operator-5694c8668f-tgck5\" (UID: \"1e511cbc-ca72-4380-8626-d9cade8ce3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tgck5" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.802553 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.802571 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/be2a1543-ec30-4cc1-923f-62123740ac1a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gj5kj\" (UID: \"be2a1543-ec30-4cc1-923f-62123740ac1a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gj5kj" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.802695 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da19aa12-0eea-49f0-820a-b2d6da99484a-serving-cert\") pod \"authentication-operator-69f744f599-nlffm\" (UID: \"da19aa12-0eea-49f0-820a-b2d6da99484a\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-nlffm" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.803126 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce6df133-1d7c-41a7-a9a8-a18676bae045-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.803168 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd61fed5-8370-4d59-8604-be1c73527f77-apiservice-cert\") pod \"packageserver-d55dfcdfc-h94q9\" (UID: \"fd61fed5-8370-4d59-8604-be1c73527f77\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.803264 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3422bfa7-f155-488a-a72c-129bed440646-config\") pod \"openshift-apiserver-operator-796bbdcf4f-26bbs\" (UID: \"3422bfa7-f155-488a-a72c-129bed440646\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-26bbs" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.804835 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ce6df133-1d7c-41a7-a9a8-a18676bae045-audit-policies\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.805158 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ce6df133-1d7c-41a7-a9a8-a18676bae045-encryption-config\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: 
\"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.804961 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ce36f77-945f-473d-8fa5-011ee88d9adb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-42f4k\" (UID: \"1ce36f77-945f-473d-8fa5-011ee88d9adb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-42f4k" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.805823 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.805830 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-client-ca\") pod \"controller-manager-879f6c89f-4twv2\" (UID: \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.806866 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10b365db-4478-4250-abe6-fa9e77354d70-config\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.806943 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/10b365db-4478-4250-abe6-fa9e77354d70-encryption-config\") pod 
\"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.806954 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.807703 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-config\") pod \"controller-manager-879f6c89f-4twv2\" (UID: \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.807708 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a823a439-112f-4160-b925-b9b0830ec38f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvpk5\" (UID: \"a823a439-112f-4160-b925-b9b0830ec38f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvpk5" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.807610 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10b365db-4478-4250-abe6-fa9e77354d70-serving-cert\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.807948 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a823a439-112f-4160-b925-b9b0830ec38f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvpk5\" (UID: \"a823a439-112f-4160-b925-b9b0830ec38f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvpk5" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.808173 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ljckr"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.808241 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ce36f77-945f-473d-8fa5-011ee88d9adb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-42f4k\" (UID: \"1ce36f77-945f-473d-8fa5-011ee88d9adb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-42f4k" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.808484 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/10b365db-4478-4250-abe6-fa9e77354d70-image-import-ca\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.808554 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/10b365db-4478-4250-abe6-fa9e77354d70-audit\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.809246 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ffdc69e-7658-4b27-949a-594688dbee92-trusted-ca\") pod \"ingress-operator-5b745b69d9-wvhc6\" (UID: 
\"9ffdc69e-7658-4b27-949a-594688dbee92\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.809255 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.809512 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd61fed5-8370-4d59-8604-be1c73527f77-webhook-cert\") pod \"packageserver-d55dfcdfc-h94q9\" (UID: \"fd61fed5-8370-4d59-8604-be1c73527f77\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.809982 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fd61fed5-8370-4d59-8604-be1c73527f77-tmpfs\") pod \"packageserver-d55dfcdfc-h94q9\" (UID: \"fd61fed5-8370-4d59-8604-be1c73527f77\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.810110 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ce6df133-1d7c-41a7-a9a8-a18676bae045-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.810188 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-audit-policies\") pod 
\"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.810450 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10b365db-4478-4250-abe6-fa9e77354d70-trusted-ca-bundle\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.811849 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ljckr" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.812344 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.812376 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be2a1543-ec30-4cc1-923f-62123740ac1a-serving-cert\") pod \"openshift-config-operator-7777fb866f-gj5kj\" (UID: \"be2a1543-ec30-4cc1-923f-62123740ac1a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gj5kj" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.812789 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1712254f-c8df-4c98-bfb2-bf79d98a6161-metrics-certs\") pod \"router-default-5444994796-r64z4\" (UID: \"1712254f-c8df-4c98-bfb2-bf79d98a6161\") " pod="openshift-ingress/router-default-5444994796-r64z4" 
Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.812809 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.813203 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ffdc69e-7658-4b27-949a-594688dbee92-metrics-tls\") pod \"ingress-operator-5b745b69d9-wvhc6\" (UID: \"9ffdc69e-7658-4b27-949a-594688dbee92\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.813286 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4459d391-5fe4-4159-b385-9c35fe4cfe77-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-psd6r\" (UID: \"4459d391-5fe4-4159-b385-9c35fe4cfe77\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-psd6r" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.813514 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3422bfa7-f155-488a-a72c-129bed440646-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-26bbs\" (UID: \"3422bfa7-f155-488a-a72c-129bed440646\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-26bbs" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.813560 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-26f4s"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.814075 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-26f4s" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.814326 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ce6df133-1d7c-41a7-a9a8-a18676bae045-etcd-client\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.814493 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ljckr"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.814565 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.815322 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-26f4s"] Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.815479 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1712254f-c8df-4c98-bfb2-bf79d98a6161-default-certificate\") pod \"router-default-5444994796-r64z4\" (UID: \"1712254f-c8df-4c98-bfb2-bf79d98a6161\") " pod="openshift-ingress/router-default-5444994796-r64z4" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.815496 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-serving-cert\") pod \"controller-manager-879f6c89f-4twv2\" (UID: \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.815535 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.815738 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/10b365db-4478-4250-abe6-fa9e77354d70-etcd-client\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.815751 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce6df133-1d7c-41a7-a9a8-a18676bae045-serving-cert\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.816383 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.835745 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.842255 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3cdcf878-abd9-481a-9f95-adeda51c77c8-proxy-tls\") pod \"machine-config-controller-84d6567774-njb7r\" (UID: \"3cdcf878-abd9-481a-9f95-adeda51c77c8\") 
" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njb7r" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.855187 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.875043 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.894706 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-oauth-config\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.894804 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72x27\" (UniqueName: \"kubernetes.io/projected/b22d491a-9add-4ec5-ad3e-e593f9ca93bd-kube-api-access-72x27\") pod \"control-plane-machine-set-operator-78cbb6b69f-44z27\" (UID: \"b22d491a-9add-4ec5-ad3e-e593f9ca93bd\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44z27" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.894838 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5-certs\") pod \"machine-config-server-k258m\" (UID: \"b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5\") " pod="openshift-machine-config-operator/machine-config-server-k258m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.894894 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/ee778128-189f-4177-9d3d-bbc1468da627-proxy-tls\") pod \"machine-config-operator-74547568cd-9m75z\" (UID: \"ee778128-189f-4177-9d3d-bbc1468da627\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.894920 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5-node-bootstrap-token\") pod \"machine-config-server-k258m\" (UID: \"b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5\") " pod="openshift-machine-config-operator/machine-config-server-k258m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.894953 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-service-ca\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.894974 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ee778128-189f-4177-9d3d-bbc1468da627-images\") pod \"machine-config-operator-74547568cd-9m75z\" (UID: \"ee778128-189f-4177-9d3d-bbc1468da627\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895023 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a560875d-c07e-457b-a77b-809cc770867c-secret-volume\") pod \"collect-profiles-29337570-l9r8r\" (UID: \"a560875d-c07e-457b-a77b-809cc770867c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895071 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-config\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895107 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-serving-cert\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895136 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd5gw\" (UniqueName: \"kubernetes.io/projected/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-kube-api-access-kd5gw\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895228 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/11c1c8aa-1469-4229-9163-9df08ae4192f-srv-cert\") pod \"catalog-operator-68c6474976-wqwjn\" (UID: \"11c1c8aa-1469-4229-9163-9df08ae4192f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895257 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/63a0b5d9-8401-4c7d-b897-295c91ffc508-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-r6fd4\" (UID: \"63a0b5d9-8401-4c7d-b897-295c91ffc508\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r6fd4" Oct 12 07:37:16 crc 
kubenswrapper[4599]: I1012 07:37:16.895318 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a560875d-c07e-457b-a77b-809cc770867c-config-volume\") pod \"collect-profiles-29337570-l9r8r\" (UID: \"a560875d-c07e-457b-a77b-809cc770867c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895371 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b22d491a-9add-4ec5-ad3e-e593f9ca93bd-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-44z27\" (UID: \"b22d491a-9add-4ec5-ad3e-e593f9ca93bd\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44z27" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895401 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c2d2a6b0-e99f-4c3b-8535-77d97c89d3ed-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gjqh4\" (UID: \"c2d2a6b0-e99f-4c3b-8535-77d97c89d3ed\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gjqh4" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895474 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw7h7\" (UniqueName: \"kubernetes.io/projected/ee778128-189f-4177-9d3d-bbc1468da627-kube-api-access-cw7h7\") pod \"machine-config-operator-74547568cd-9m75z\" (UID: \"ee778128-189f-4177-9d3d-bbc1468da627\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895514 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzz22\" (UniqueName: 
\"kubernetes.io/projected/a560875d-c07e-457b-a77b-809cc770867c-kube-api-access-pzz22\") pod \"collect-profiles-29337570-l9r8r\" (UID: \"a560875d-c07e-457b-a77b-809cc770867c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895534 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfwgl\" (UniqueName: \"kubernetes.io/projected/b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5-kube-api-access-bfwgl\") pod \"machine-config-server-k258m\" (UID: \"b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5\") " pod="openshift-machine-config-operator/machine-config-server-k258m" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895570 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/503dce4f-a5a4-4a20-8546-a0311a212b84-serving-cert\") pod \"service-ca-operator-777779d784-4jqfc\" (UID: \"503dce4f-a5a4-4a20-8546-a0311a212b84\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jqfc" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895589 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/11c1c8aa-1469-4229-9163-9df08ae4192f-profile-collector-cert\") pod \"catalog-operator-68c6474976-wqwjn\" (UID: \"11c1c8aa-1469-4229-9163-9df08ae4192f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895642 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-oauth-serving-cert\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895665 4599 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbqms\" (UniqueName: \"kubernetes.io/projected/11c1c8aa-1469-4229-9163-9df08ae4192f-kube-api-access-gbqms\") pod \"catalog-operator-68c6474976-wqwjn\" (UID: \"11c1c8aa-1469-4229-9163-9df08ae4192f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895689 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6lxw\" (UniqueName: \"kubernetes.io/projected/0c13d504-e266-472f-9ca7-6644f8b37eef-kube-api-access-w6lxw\") pod \"migrator-59844c95c7-qw5t7\" (UID: \"0c13d504-e266-472f-9ca7-6644f8b37eef\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5t7" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895712 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5f74\" (UniqueName: \"kubernetes.io/projected/63a0b5d9-8401-4c7d-b897-295c91ffc508-kube-api-access-d5f74\") pod \"package-server-manager-789f6589d5-r6fd4\" (UID: \"63a0b5d9-8401-4c7d-b897-295c91ffc508\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r6fd4" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895712 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895780 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjb7q\" (UniqueName: \"kubernetes.io/projected/503dce4f-a5a4-4a20-8546-a0311a212b84-kube-api-access-vjb7q\") pod \"service-ca-operator-777779d784-4jqfc\" (UID: \"503dce4f-a5a4-4a20-8546-a0311a212b84\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jqfc" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895806 4599 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/503dce4f-a5a4-4a20-8546-a0311a212b84-config\") pod \"service-ca-operator-777779d784-4jqfc\" (UID: \"503dce4f-a5a4-4a20-8546-a0311a212b84\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jqfc" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895849 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bzvg\" (UniqueName: \"kubernetes.io/projected/c2d2a6b0-e99f-4c3b-8535-77d97c89d3ed-kube-api-access-5bzvg\") pod \"multus-admission-controller-857f4d67dd-gjqh4\" (UID: \"c2d2a6b0-e99f-4c3b-8535-77d97c89d3ed\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gjqh4" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895871 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-trusted-ca-bundle\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.895925 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ee778128-189f-4177-9d3d-bbc1468da627-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9m75z\" (UID: \"ee778128-189f-4177-9d3d-bbc1468da627\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.896850 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ee778128-189f-4177-9d3d-bbc1468da627-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9m75z\" (UID: \"ee778128-189f-4177-9d3d-bbc1468da627\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.906094 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/003c1afa-e2c6-45c8-bdb3-1da462a231d3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5mhff\" (UID: \"003c1afa-e2c6-45c8-bdb3-1da462a231d3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5mhff" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.915190 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.925505 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003c1afa-e2c6-45c8-bdb3-1da462a231d3-config\") pod \"kube-controller-manager-operator-78b949d7b-5mhff\" (UID: \"003c1afa-e2c6-45c8-bdb3-1da462a231d3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5mhff" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.935553 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.955506 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.974886 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.980520 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/ddcd57da-ffda-491f-bfaf-c484979a8121-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ngfnl\" (UID: \"ddcd57da-ffda-491f-bfaf-c484979a8121\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngfnl" Oct 12 07:37:16 crc kubenswrapper[4599]: I1012 07:37:16.994756 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.006635 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddcd57da-ffda-491f-bfaf-c484979a8121-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ngfnl\" (UID: \"ddcd57da-ffda-491f-bfaf-c484979a8121\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngfnl" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.014887 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.035439 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.035958 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09a80831-32a7-4583-909d-96fa119c7aa1-etcd-ca\") pod \"etcd-operator-b45778765-8nzkj\" (UID: \"09a80831-32a7-4583-909d-96fa119c7aa1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.056005 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.075902 4599 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.095695 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.106560 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09a80831-32a7-4583-909d-96fa119c7aa1-config\") pod \"etcd-operator-b45778765-8nzkj\" (UID: \"09a80831-32a7-4583-909d-96fa119c7aa1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.115146 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.128864 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09a80831-32a7-4583-909d-96fa119c7aa1-etcd-client\") pod \"etcd-operator-b45778765-8nzkj\" (UID: \"09a80831-32a7-4583-909d-96fa119c7aa1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.134942 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.156001 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.165098 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09a80831-32a7-4583-909d-96fa119c7aa1-serving-cert\") pod \"etcd-operator-b45778765-8nzkj\" (UID: \"09a80831-32a7-4583-909d-96fa119c7aa1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj" Oct 12 07:37:17 crc 
kubenswrapper[4599]: I1012 07:37:17.175470 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.185463 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09a80831-32a7-4583-909d-96fa119c7aa1-etcd-service-ca\") pod \"etcd-operator-b45778765-8nzkj\" (UID: \"09a80831-32a7-4583-909d-96fa119c7aa1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.195013 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.216040 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.235721 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.241749 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69d5f53c-375a-424e-a120-93d89d06ae50-serving-cert\") pod \"route-controller-manager-6576b87f9c-8r7fw\" (UID: \"69d5f53c-375a-424e-a120-93d89d06ae50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.255527 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.264821 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69d5f53c-375a-424e-a120-93d89d06ae50-config\") pod 
\"route-controller-manager-6576b87f9c-8r7fw\" (UID: \"69d5f53c-375a-424e-a120-93d89d06ae50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.276222 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.285132 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69d5f53c-375a-424e-a120-93d89d06ae50-client-ca\") pod \"route-controller-manager-6576b87f9c-8r7fw\" (UID: \"69d5f53c-375a-424e-a120-93d89d06ae50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.295290 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.315694 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.335844 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.355944 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.374942 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.395185 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.419932 4599 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.435823 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.455656 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.474918 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.494953 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.515436 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.529126 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/63a0b5d9-8401-4c7d-b897-295c91ffc508-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-r6fd4\" (UID: \"63a0b5d9-8401-4c7d-b897-295c91ffc508\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r6fd4" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.535128 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.558951 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 12 07:37:17 crc kubenswrapper[4599]: 
I1012 07:37:17.569252 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b22d491a-9add-4ec5-ad3e-e593f9ca93bd-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-44z27\" (UID: \"b22d491a-9add-4ec5-ad3e-e593f9ca93bd\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44z27" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.576159 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.595187 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.615407 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.635582 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.638313 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/503dce4f-a5a4-4a20-8546-a0311a212b84-serving-cert\") pod \"service-ca-operator-777779d784-4jqfc\" (UID: \"503dce4f-a5a4-4a20-8546-a0311a212b84\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jqfc" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.655259 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.656939 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/503dce4f-a5a4-4a20-8546-a0311a212b84-config\") pod \"service-ca-operator-777779d784-4jqfc\" (UID: \"503dce4f-a5a4-4a20-8546-a0311a212b84\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jqfc" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.674808 4599 request.go:700] Waited for 1.001634485s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.675914 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.695182 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.698323 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c2d2a6b0-e99f-4c3b-8535-77d97c89d3ed-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gjqh4\" (UID: \"c2d2a6b0-e99f-4c3b-8535-77d97c89d3ed\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gjqh4" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.715753 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.735096 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.755378 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 12 07:37:17 crc 
kubenswrapper[4599]: I1012 07:37:17.776051 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.786215 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ee778128-189f-4177-9d3d-bbc1468da627-images\") pod \"machine-config-operator-74547568cd-9m75z\" (UID: \"ee778128-189f-4177-9d3d-bbc1468da627\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.795358 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.804727 4599 secret.go:188] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.804849 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0409709f-7b11-4bcd-aec9-b8922c4474c9-machine-approver-tls podName:0409709f-7b11-4bcd-aec9-b8922c4474c9 nodeName:}" failed. No retries permitted until 2025-10-12 07:37:18.304817206 +0000 UTC m=+135.094012708 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/0409709f-7b11-4bcd-aec9-b8922c4474c9-machine-approver-tls") pod "machine-approver-56656f9798-f8wgl" (UID: "0409709f-7b11-4bcd-aec9-b8922c4474c9") : failed to sync secret cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.808421 4599 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.808485 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0409709f-7b11-4bcd-aec9-b8922c4474c9-config podName:0409709f-7b11-4bcd-aec9-b8922c4474c9 nodeName:}" failed. No retries permitted until 2025-10-12 07:37:18.308470019 +0000 UTC m=+135.097665522 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/0409709f-7b11-4bcd-aec9-b8922c4474c9-config") pod "machine-approver-56656f9798-f8wgl" (UID: "0409709f-7b11-4bcd-aec9-b8922c4474c9") : failed to sync configmap cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.809823 4599 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.809894 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0409709f-7b11-4bcd-aec9-b8922c4474c9-auth-proxy-config podName:0409709f-7b11-4bcd-aec9-b8922c4474c9 nodeName:}" failed. No retries permitted until 2025-10-12 07:37:18.309855473 +0000 UTC m=+135.099050966 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/0409709f-7b11-4bcd-aec9-b8922c4474c9-auth-proxy-config") pod "machine-approver-56656f9798-f8wgl" (UID: "0409709f-7b11-4bcd-aec9-b8922c4474c9") : failed to sync configmap cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.816226 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.828939 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee778128-189f-4177-9d3d-bbc1468da627-proxy-tls\") pod \"machine-config-operator-74547568cd-9m75z\" (UID: \"ee778128-189f-4177-9d3d-bbc1468da627\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.835823 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.875324 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.894873 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.894934 4599 secret.go:188] Couldn't get secret openshift-console/console-oauth-config: failed to sync secret cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.894982 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-oauth-config podName:f1daac45-8c29-46d6-a4ca-6c42bc99f1f7 nodeName:}" failed. 
No retries permitted until 2025-10-12 07:37:18.394970832 +0000 UTC m=+135.184166333 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "console-oauth-config" (UniqueName: "kubernetes.io/secret/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-oauth-config") pod "console-f9d7485db-lwg5q" (UID: "f1daac45-8c29-46d6-a4ca-6c42bc99f1f7") : failed to sync secret cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.895004 4599 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.895034 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5-certs podName:b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5 nodeName:}" failed. No retries permitted until 2025-10-12 07:37:18.395024122 +0000 UTC m=+135.184219624 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5-certs") pod "machine-config-server-k258m" (UID: "b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5") : failed to sync secret cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.895056 4599 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.895078 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5-node-bootstrap-token podName:b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5 nodeName:}" failed. No retries permitted until 2025-10-12 07:37:18.395072924 +0000 UTC m=+135.184268426 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5-node-bootstrap-token") pod "machine-config-server-k258m" (UID: "b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5") : failed to sync secret cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.895112 4599 configmap.go:193] Couldn't get configMap openshift-console/service-ca: failed to sync configmap cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.895135 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-service-ca podName:f1daac45-8c29-46d6-a4ca-6c42bc99f1f7 nodeName:}" failed. No retries permitted until 2025-10-12 07:37:18.395131054 +0000 UTC m=+135.184326555 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca" (UniqueName: "kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-service-ca") pod "console-f9d7485db-lwg5q" (UID: "f1daac45-8c29-46d6-a4ca-6c42bc99f1f7") : failed to sync configmap cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.895155 4599 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.895177 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a560875d-c07e-457b-a77b-809cc770867c-secret-volume podName:a560875d-c07e-457b-a77b-809cc770867c nodeName:}" failed. No retries permitted until 2025-10-12 07:37:18.39517161 +0000 UTC m=+135.184367112 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-volume" (UniqueName: "kubernetes.io/secret/a560875d-c07e-457b-a77b-809cc770867c-secret-volume") pod "collect-profiles-29337570-l9r8r" (UID: "a560875d-c07e-457b-a77b-809cc770867c") : failed to sync secret cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.895194 4599 secret.go:188] Couldn't get secret openshift-console/console-serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.895229 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-serving-cert podName:f1daac45-8c29-46d6-a4ca-6c42bc99f1f7 nodeName:}" failed. No retries permitted until 2025-10-12 07:37:18.395223568 +0000 UTC m=+135.184419070 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-serving-cert") pod "console-f9d7485db-lwg5q" (UID: "f1daac45-8c29-46d6-a4ca-6c42bc99f1f7") : failed to sync secret cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.895247 4599 configmap.go:193] Couldn't get configMap openshift-console/console-config: failed to sync configmap cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.895279 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-config podName:f1daac45-8c29-46d6-a4ca-6c42bc99f1f7 nodeName:}" failed. No retries permitted until 2025-10-12 07:37:18.395269505 +0000 UTC m=+135.184465007 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "console-config" (UniqueName: "kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-config") pod "console-f9d7485db-lwg5q" (UID: "f1daac45-8c29-46d6-a4ca-6c42bc99f1f7") : failed to sync configmap cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.895693 4599 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.895749 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a560875d-c07e-457b-a77b-809cc770867c-config-volume podName:a560875d-c07e-457b-a77b-809cc770867c nodeName:}" failed. No retries permitted until 2025-10-12 07:37:18.395741646 +0000 UTC m=+135.184937147 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/a560875d-c07e-457b-a77b-809cc770867c-config-volume") pod "collect-profiles-29337570-l9r8r" (UID: "a560875d-c07e-457b-a77b-809cc770867c") : failed to sync configmap cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.895709 4599 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.895779 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11c1c8aa-1469-4229-9163-9df08ae4192f-srv-cert podName:11c1c8aa-1469-4229-9163-9df08ae4192f nodeName:}" failed. No retries permitted until 2025-10-12 07:37:18.395773516 +0000 UTC m=+135.184969019 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/11c1c8aa-1469-4229-9163-9df08ae4192f-srv-cert") pod "catalog-operator-68c6474976-wqwjn" (UID: "11c1c8aa-1469-4229-9163-9df08ae4192f") : failed to sync secret cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.895717 4599 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.895804 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11c1c8aa-1469-4229-9163-9df08ae4192f-profile-collector-cert podName:11c1c8aa-1469-4229-9163-9df08ae4192f nodeName:}" failed. No retries permitted until 2025-10-12 07:37:18.395798974 +0000 UTC m=+135.184994476 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/11c1c8aa-1469-4229-9163-9df08ae4192f-profile-collector-cert") pod "catalog-operator-68c6474976-wqwjn" (UID: "11c1c8aa-1469-4229-9163-9df08ae4192f") : failed to sync secret cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.895841 4599 configmap.go:193] Couldn't get configMap openshift-console/oauth-serving-cert: failed to sync configmap cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.895865 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-oauth-serving-cert podName:f1daac45-8c29-46d6-a4ca-6c42bc99f1f7 nodeName:}" failed. No retries permitted until 2025-10-12 07:37:18.395857425 +0000 UTC m=+135.185052926 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "oauth-serving-cert" (UniqueName: "kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-oauth-serving-cert") pod "console-f9d7485db-lwg5q" (UID: "f1daac45-8c29-46d6-a4ca-6c42bc99f1f7") : failed to sync configmap cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.896824 4599 configmap.go:193] Couldn't get configMap openshift-console/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: E1012 07:37:17.896867 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-trusted-ca-bundle podName:f1daac45-8c29-46d6-a4ca-6c42bc99f1f7 nodeName:}" failed. No retries permitted until 2025-10-12 07:37:18.396858354 +0000 UTC m=+135.186053855 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-trusted-ca-bundle") pod "console-f9d7485db-lwg5q" (UID: "f1daac45-8c29-46d6-a4ca-6c42bc99f1f7") : failed to sync configmap cache: timed out waiting for the condition Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.920461 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.935587 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.954966 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 12 07:37:17 crc kubenswrapper[4599]: I1012 07:37:17.975308 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 12 07:37:17 crc 
kubenswrapper[4599]: I1012 07:37:17.995727 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.015631 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.036003 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.055416 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.074936 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.095494 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.120064 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.135714 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.155774 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.175152 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.195639 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.215412 4599 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.235769 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.255860 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.275741 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.295744 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.314144 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0409709f-7b11-4bcd-aec9-b8922c4474c9-auth-proxy-config\") pod \"machine-approver-56656f9798-f8wgl\" (UID: \"0409709f-7b11-4bcd-aec9-b8922c4474c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.314264 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0409709f-7b11-4bcd-aec9-b8922c4474c9-machine-approver-tls\") pod \"machine-approver-56656f9798-f8wgl\" (UID: \"0409709f-7b11-4bcd-aec9-b8922c4474c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.314370 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0409709f-7b11-4bcd-aec9-b8922c4474c9-config\") pod \"machine-approver-56656f9798-f8wgl\" (UID: 
\"0409709f-7b11-4bcd-aec9-b8922c4474c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.315999 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.335421 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.355276 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.375167 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.396144 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.415239 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.415469 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-trusted-ca-bundle\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.415571 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-oauth-config\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:18 crc 
kubenswrapper[4599]: I1012 07:37:18.415630 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5-certs\") pod \"machine-config-server-k258m\" (UID: \"b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5\") " pod="openshift-machine-config-operator/machine-config-server-k258m" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.415695 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5-node-bootstrap-token\") pod \"machine-config-server-k258m\" (UID: \"b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5\") " pod="openshift-machine-config-operator/machine-config-server-k258m" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.415766 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-service-ca\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.415856 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a560875d-c07e-457b-a77b-809cc770867c-secret-volume\") pod \"collect-profiles-29337570-l9r8r\" (UID: \"a560875d-c07e-457b-a77b-809cc770867c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.415924 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-config\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:18 crc 
kubenswrapper[4599]: I1012 07:37:18.415988 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-serving-cert\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.416081 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/11c1c8aa-1469-4229-9163-9df08ae4192f-srv-cert\") pod \"catalog-operator-68c6474976-wqwjn\" (UID: \"11c1c8aa-1469-4229-9163-9df08ae4192f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.416183 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a560875d-c07e-457b-a77b-809cc770867c-config-volume\") pod \"collect-profiles-29337570-l9r8r\" (UID: \"a560875d-c07e-457b-a77b-809cc770867c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.416330 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/11c1c8aa-1469-4229-9163-9df08ae4192f-profile-collector-cert\") pod \"catalog-operator-68c6474976-wqwjn\" (UID: \"11c1c8aa-1469-4229-9163-9df08ae4192f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.416429 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-oauth-serving-cert\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " 
pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.417176 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-config\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.417520 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-service-ca\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.417587 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-oauth-serving-cert\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.417679 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a560875d-c07e-457b-a77b-809cc770867c-config-volume\") pod \"collect-profiles-29337570-l9r8r\" (UID: \"a560875d-c07e-457b-a77b-809cc770867c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.418596 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-trusted-ca-bundle\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:18 
crc kubenswrapper[4599]: I1012 07:37:18.419181 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5-node-bootstrap-token\") pod \"machine-config-server-k258m\" (UID: \"b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5\") " pod="openshift-machine-config-operator/machine-config-server-k258m" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.420204 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-oauth-config\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.420446 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/11c1c8aa-1469-4229-9163-9df08ae4192f-profile-collector-cert\") pod \"catalog-operator-68c6474976-wqwjn\" (UID: \"11c1c8aa-1469-4229-9163-9df08ae4192f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.420759 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/11c1c8aa-1469-4229-9163-9df08ae4192f-srv-cert\") pod \"catalog-operator-68c6474976-wqwjn\" (UID: \"11c1c8aa-1469-4229-9163-9df08ae4192f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.421709 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a560875d-c07e-457b-a77b-809cc770867c-secret-volume\") pod \"collect-profiles-29337570-l9r8r\" (UID: \"a560875d-c07e-457b-a77b-809cc770867c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.422218 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-serving-cert\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.436012 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.456066 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.469068 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5-certs\") pod \"machine-config-server-k258m\" (UID: \"b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5\") " pod="openshift-machine-config-operator/machine-config-server-k258m" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.506942 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f6vn\" (UniqueName: \"kubernetes.io/projected/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-kube-api-access-8f6vn\") pod \"controller-manager-879f6c89f-4twv2\" (UID: \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.527197 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl8hm\" (UniqueName: \"kubernetes.io/projected/1e511cbc-ca72-4380-8626-d9cade8ce3e2-kube-api-access-tl8hm\") pod \"machine-api-operator-5694c8668f-tgck5\" (UID: 
\"1e511cbc-ca72-4380-8626-d9cade8ce3e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tgck5" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.545827 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n26sd\" (UniqueName: \"kubernetes.io/projected/ddcd57da-ffda-491f-bfaf-c484979a8121-kube-api-access-n26sd\") pod \"openshift-controller-manager-operator-756b6f6bc6-ngfnl\" (UID: \"ddcd57da-ffda-491f-bfaf-c484979a8121\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngfnl" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.566699 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ghlf\" (UniqueName: \"kubernetes.io/projected/1ce36f77-945f-473d-8fa5-011ee88d9adb-kube-api-access-9ghlf\") pod \"cluster-image-registry-operator-dc59b4c8b-42f4k\" (UID: \"1ce36f77-945f-473d-8fa5-011ee88d9adb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-42f4k" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.587073 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j2ww\" (UniqueName: \"kubernetes.io/projected/69d5f53c-375a-424e-a120-93d89d06ae50-kube-api-access-7j2ww\") pod \"route-controller-manager-6576b87f9c-8r7fw\" (UID: \"69d5f53c-375a-424e-a120-93d89d06ae50\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.614915 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bab40ec2-3024-45f4-9efb-f07829d8786b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ggnjr\" (UID: \"bab40ec2-3024-45f4-9efb-f07829d8786b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ggnjr" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.626233 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4257\" (UniqueName: \"kubernetes.io/projected/10b365db-4478-4250-abe6-fa9e77354d70-kube-api-access-x4257\") pod \"apiserver-76f77b778f-v2b5m\" (UID: \"10b365db-4478-4250-abe6-fa9e77354d70\") " pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.640435 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngfnl" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.646481 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk4p2\" (UniqueName: \"kubernetes.io/projected/da19aa12-0eea-49f0-820a-b2d6da99484a-kube-api-access-mk4p2\") pod \"authentication-operator-69f744f599-nlffm\" (UID: \"da19aa12-0eea-49f0-820a-b2d6da99484a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nlffm" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.656608 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.667649 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdzsp\" (UniqueName: \"kubernetes.io/projected/be2a1543-ec30-4cc1-923f-62123740ac1a-kube-api-access-xdzsp\") pod \"openshift-config-operator-7777fb866f-gj5kj\" (UID: \"be2a1543-ec30-4cc1-923f-62123740ac1a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gj5kj" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.688508 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67s26\" (UniqueName: \"kubernetes.io/projected/fd61fed5-8370-4d59-8604-be1c73527f77-kube-api-access-67s26\") pod \"packageserver-d55dfcdfc-h94q9\" (UID: \"fd61fed5-8370-4d59-8604-be1c73527f77\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.691209 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.694313 4599 request.go:700] Waited for 1.890501402s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/serviceaccounts/cluster-samples-operator/token Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.698966 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-tgck5" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.710244 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf7z9\" (UniqueName: \"kubernetes.io/projected/4459d391-5fe4-4159-b385-9c35fe4cfe77-kube-api-access-pf7z9\") pod \"cluster-samples-operator-665b6dd947-psd6r\" (UID: \"4459d391-5fe4-4159-b385-9c35fe4cfe77\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-psd6r" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.726477 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-nlffm" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.730902 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26gj4\" (UniqueName: \"kubernetes.io/projected/ce6df133-1d7c-41a7-a9a8-a18676bae045-kube-api-access-26gj4\") pod \"apiserver-7bbb656c7d-pnj7m\" (UID: \"ce6df133-1d7c-41a7-a9a8-a18676bae045\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.748115 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zws9k\" (UniqueName: \"kubernetes.io/projected/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-kube-api-access-zws9k\") pod \"oauth-openshift-558db77b4-jlnn9\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.769783 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd2kf\" (UniqueName: \"kubernetes.io/projected/09a80831-32a7-4583-909d-96fa119c7aa1-kube-api-access-jd2kf\") pod \"etcd-operator-b45778765-8nzkj\" (UID: \"09a80831-32a7-4583-909d-96fa119c7aa1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj" Oct 12 
07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.776813 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.787111 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngfnl"] Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.800265 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gj5kj" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.809366 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw"] Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.809824 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d7kc\" (UniqueName: \"kubernetes.io/projected/3422bfa7-f155-488a-a72c-129bed440646-kube-api-access-7d7kc\") pod \"openshift-apiserver-operator-796bbdcf4f-26bbs\" (UID: \"3422bfa7-f155-488a-a72c-129bed440646\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-26bbs" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.828218 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ggnjr" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.828590 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1ce36f77-945f-473d-8fa5-011ee88d9adb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-42f4k\" (UID: \"1ce36f77-945f-473d-8fa5-011ee88d9adb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-42f4k" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.841515 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.846051 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-psd6r" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.853781 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-26bbs" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.858244 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-42f4k" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.866606 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.866863 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4twv2"] Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.871408 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.877840 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wggb5\" (UniqueName: \"kubernetes.io/projected/1712254f-c8df-4c98-bfb2-bf79d98a6161-kube-api-access-wggb5\") pod \"router-default-5444994796-r64z4\" (UID: \"1712254f-c8df-4c98-bfb2-bf79d98a6161\") " pod="openshift-ingress/router-default-5444994796-r64z4" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.883197 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/003c1afa-e2c6-45c8-bdb3-1da462a231d3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5mhff\" (UID: \"003c1afa-e2c6-45c8-bdb3-1da462a231d3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5mhff" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.893875 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5vsk\" (UniqueName: \"kubernetes.io/projected/3cdcf878-abd9-481a-9f95-adeda51c77c8-kube-api-access-s5vsk\") pod \"machine-config-controller-84d6567774-njb7r\" (UID: \"3cdcf878-abd9-481a-9f95-adeda51c77c8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njb7r" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.910494 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69hmp\" (UniqueName: \"kubernetes.io/projected/9ffdc69e-7658-4b27-949a-594688dbee92-kube-api-access-69hmp\") pod \"ingress-operator-5b745b69d9-wvhc6\" (UID: \"9ffdc69e-7658-4b27-949a-594688dbee92\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.925628 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/machine-api-operator-5694c8668f-tgck5"] Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.934509 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9ffdc69e-7658-4b27-949a-594688dbee92-bound-sa-token\") pod \"ingress-operator-5b745b69d9-wvhc6\" (UID: \"9ffdc69e-7658-4b27-949a-594688dbee92\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.953641 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.958233 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-nlffm"] Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.961150 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8skzv\" (UniqueName: \"kubernetes.io/projected/2bd8b379-5b19-453f-af09-9025f6b127db-kube-api-access-8skzv\") pod \"dns-operator-744455d44c-dn2jf\" (UID: \"2bd8b379-5b19-453f-af09-9025f6b127db\") " pod="openshift-dns-operator/dns-operator-744455d44c-dn2jf" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.968792 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhz5w\" (UniqueName: \"kubernetes.io/projected/b84f6b8e-f98e-4d84-823f-f83d7912bc6b-kube-api-access-mhz5w\") pod \"downloads-7954f5f757-kf2t6\" (UID: \"b84f6b8e-f98e-4d84-823f-f83d7912bc6b\") " pod="openshift-console/downloads-7954f5f757-kf2t6" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.992834 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r5m6\" (UniqueName: \"kubernetes.io/projected/a823a439-112f-4160-b925-b9b0830ec38f-kube-api-access-7r5m6\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-kvpk5\" (UID: \"a823a439-112f-4160-b925-b9b0830ec38f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvpk5" Oct 12 07:37:18 crc kubenswrapper[4599]: I1012 07:37:18.999863 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.015152 4599 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.037971 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.055649 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.076199 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.078241 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ggnjr"] Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.097079 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.101867 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-kf2t6" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.110157 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-dn2jf" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.113854 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" event={"ID":"69d5f53c-375a-424e-a120-93d89d06ae50","Type":"ContainerStarted","Data":"1e2e2fbce08b456e2555df4f72f7e3bacfb9ae9768ff4a4d00995b004ec99e1f"} Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.113903 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" event={"ID":"69d5f53c-375a-424e-a120-93d89d06ae50","Type":"ContainerStarted","Data":"9d315ae2011162774cd2fcf7e3ac2a6355c14eff33398988ee2867625b67447a"} Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.114151 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.115436 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.116488 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvpk5" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.121071 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.121916 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tgck5" event={"ID":"1e511cbc-ca72-4380-8626-d9cade8ce3e2","Type":"ContainerStarted","Data":"90033d0893be338485536ef9c93fe3c3a1e9afee126798130a7353e1809aac28"} Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.123855 4599 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-8r7fw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.123891 4599 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" podUID="69d5f53c-375a-424e-a120-93d89d06ae50" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.130780 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" event={"ID":"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8","Type":"ContainerStarted","Data":"e793344546dda56bb8c93f53481ee5ab7982deabefe22e9ee1934a7126048cce"} Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.130804 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" event={"ID":"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8","Type":"ContainerStarted","Data":"84287a8208edc4d28dfe446e6f2535c88edb8cf46e7c1186260e18c960562c90"} Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.131382 4599 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.133759 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-r64z4" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.134713 4599 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-4twv2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.134740 4599 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" podUID="2b0cadaa-2cc9-45a1-add5-6d1b13c114f8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.138446 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngfnl" event={"ID":"ddcd57da-ffda-491f-bfaf-c484979a8121","Type":"ContainerStarted","Data":"ee2786e929d0fb888f80fc680fe7de238bb1f92d4aead19151e42a9d2c11f6c5"} Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.138479 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngfnl" event={"ID":"ddcd57da-ffda-491f-bfaf-c484979a8121","Type":"ContainerStarted","Data":"e00b7daf239ee9b1aaad567850ed648ec572fba6a1e6aa0d41562855a53a390e"} Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.144364 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-nlffm" 
event={"ID":"da19aa12-0eea-49f0-820a-b2d6da99484a","Type":"ContainerStarted","Data":"532a6f911126bd4a9f48be10d9366cbf6aeaa1641256945aa811a66023278826"} Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.150934 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72x27\" (UniqueName: \"kubernetes.io/projected/b22d491a-9add-4ec5-ad3e-e593f9ca93bd-kube-api-access-72x27\") pod \"control-plane-machine-set-operator-78cbb6b69f-44z27\" (UID: \"b22d491a-9add-4ec5-ad3e-e593f9ca93bd\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44z27" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.174663 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd5gw\" (UniqueName: \"kubernetes.io/projected/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-kube-api-access-kd5gw\") pod \"console-f9d7485db-lwg5q\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.177500 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5mhff" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.184904 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njb7r" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.192482 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-v2b5m"] Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.192868 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzz22\" (UniqueName: \"kubernetes.io/projected/a560875d-c07e-457b-a77b-809cc770867c-kube-api-access-pzz22\") pod \"collect-profiles-29337570-l9r8r\" (UID: \"a560875d-c07e-457b-a77b-809cc770867c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.219693 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfwgl\" (UniqueName: \"kubernetes.io/projected/b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5-kube-api-access-bfwgl\") pod \"machine-config-server-k258m\" (UID: \"b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5\") " pod="openshift-machine-config-operator/machine-config-server-k258m" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.232624 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbqms\" (UniqueName: \"kubernetes.io/projected/11c1c8aa-1469-4229-9163-9df08ae4192f-kube-api-access-gbqms\") pod \"catalog-operator-68c6474976-wqwjn\" (UID: \"11c1c8aa-1469-4229-9163-9df08ae4192f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.249361 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6lxw\" (UniqueName: \"kubernetes.io/projected/0c13d504-e266-472f-9ca7-6644f8b37eef-kube-api-access-w6lxw\") pod \"migrator-59844c95c7-qw5t7\" (UID: \"0c13d504-e266-472f-9ca7-6644f8b37eef\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5t7" Oct 12 07:37:19 crc 
kubenswrapper[4599]: W1012 07:37:19.250738 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10b365db_4478_4250_abe6_fa9e77354d70.slice/crio-998542afb78b84d053410eb7ddb92f89f3c52175c772380544a0cff8acad2305 WatchSource:0}: Error finding container 998542afb78b84d053410eb7ddb92f89f3c52175c772380544a0cff8acad2305: Status 404 returned error can't find the container with id 998542afb78b84d053410eb7ddb92f89f3c52175c772380544a0cff8acad2305 Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.278795 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44z27" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.279259 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5f74\" (UniqueName: \"kubernetes.io/projected/63a0b5d9-8401-4c7d-b897-295c91ffc508-kube-api-access-d5f74\") pod \"package-server-manager-789f6589d5-r6fd4\" (UID: \"63a0b5d9-8401-4c7d-b897-295c91ffc508\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r6fd4" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.286286 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5t7" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.294069 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gj5kj"] Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.296179 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjb7q\" (UniqueName: \"kubernetes.io/projected/503dce4f-a5a4-4a20-8546-a0311a212b84-kube-api-access-vjb7q\") pod \"service-ca-operator-777779d784-4jqfc\" (UID: \"503dce4f-a5a4-4a20-8546-a0311a212b84\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jqfc" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.310614 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bzvg\" (UniqueName: \"kubernetes.io/projected/c2d2a6b0-e99f-4c3b-8535-77d97c89d3ed-kube-api-access-5bzvg\") pod \"multus-admission-controller-857f4d67dd-gjqh4\" (UID: \"c2d2a6b0-e99f-4c3b-8535-77d97c89d3ed\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gjqh4" Oct 12 07:37:19 crc kubenswrapper[4599]: W1012 07:37:19.312743 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe2a1543_ec30_4cc1_923f_62123740ac1a.slice/crio-6ecaead645bf82b00a678feec44001ebcf26f9473880b175a1270b0c5c8e9f78 WatchSource:0}: Error finding container 6ecaead645bf82b00a678feec44001ebcf26f9473880b175a1270b0c5c8e9f78: Status 404 returned error can't find the container with id 6ecaead645bf82b00a678feec44001ebcf26f9473880b175a1270b0c5c8e9f78 Oct 12 07:37:19 crc kubenswrapper[4599]: E1012 07:37:19.314484 4599 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition Oct 12 07:37:19 crc kubenswrapper[4599]: E1012 07:37:19.314557 4599 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0409709f-7b11-4bcd-aec9-b8922c4474c9-config podName:0409709f-7b11-4bcd-aec9-b8922c4474c9 nodeName:}" failed. No retries permitted until 2025-10-12 07:37:20.31452189 +0000 UTC m=+137.103717392 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/0409709f-7b11-4bcd-aec9-b8922c4474c9-config") pod "machine-approver-56656f9798-f8wgl" (UID: "0409709f-7b11-4bcd-aec9-b8922c4474c9") : failed to sync configmap cache: timed out waiting for the condition Oct 12 07:37:19 crc kubenswrapper[4599]: E1012 07:37:19.314579 4599 secret.go:188] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition Oct 12 07:37:19 crc kubenswrapper[4599]: E1012 07:37:19.314603 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0409709f-7b11-4bcd-aec9-b8922c4474c9-machine-approver-tls podName:0409709f-7b11-4bcd-aec9-b8922c4474c9 nodeName:}" failed. No retries permitted until 2025-10-12 07:37:20.314597113 +0000 UTC m=+137.103792614 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/0409709f-7b11-4bcd-aec9-b8922c4474c9-machine-approver-tls") pod "machine-approver-56656f9798-f8wgl" (UID: "0409709f-7b11-4bcd-aec9-b8922c4474c9") : failed to sync secret cache: timed out waiting for the condition Oct 12 07:37:19 crc kubenswrapper[4599]: E1012 07:37:19.314928 4599 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Oct 12 07:37:19 crc kubenswrapper[4599]: E1012 07:37:19.314978 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0409709f-7b11-4bcd-aec9-b8922c4474c9-auth-proxy-config podName:0409709f-7b11-4bcd-aec9-b8922c4474c9 nodeName:}" failed. No retries permitted until 2025-10-12 07:37:20.314970346 +0000 UTC m=+137.104165848 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/0409709f-7b11-4bcd-aec9-b8922c4474c9-auth-proxy-config") pod "machine-approver-56656f9798-f8wgl" (UID: "0409709f-7b11-4bcd-aec9-b8922c4474c9") : failed to sync configmap cache: timed out waiting for the condition Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.320882 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.327769 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.330757 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw7h7\" (UniqueName: \"kubernetes.io/projected/ee778128-189f-4177-9d3d-bbc1468da627-kube-api-access-cw7h7\") pod \"machine-config-operator-74547568cd-9m75z\" (UID: \"ee778128-189f-4177-9d3d-bbc1468da627\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.335550 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.337928 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.358979 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.365056 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-k258m" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.395424 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m"] Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.396007 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.396279 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.396908 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jlnn9"] Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.413525 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-psd6r"] Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.415228 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9"] Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.415519 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.422919 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6xzs\" (UniqueName: \"kubernetes.io/projected/0409709f-7b11-4bcd-aec9-b8922c4474c9-kube-api-access-j6xzs\") pod \"machine-approver-56656f9798-f8wgl\" (UID: \"0409709f-7b11-4bcd-aec9-b8922c4474c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.458017 4599 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.505195 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8nzkj"] Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.514158 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-26bbs"] Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.514180 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-42f4k"] Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536118 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/147458bd-e3d6-4425-99ab-d2bd59fc2816-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-5kmjp\" (UID: \"147458bd-e3d6-4425-99ab-d2bd59fc2816\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5kmjp" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536150 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c565f4c6-fef1-4ba7-be0b-cfb3ade367a9-srv-cert\") pod \"olm-operator-6b444d44fb-xq6bd\" (UID: \"c565f4c6-fef1-4ba7-be0b-cfb3ade367a9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xq6bd" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536170 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6d9a3bfc-818e-445a-a027-264cfcfade2b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc 
kubenswrapper[4599]: I1012 07:37:19.536218 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536247 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c565f4c6-fef1-4ba7-be0b-cfb3ade367a9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xq6bd\" (UID: \"c565f4c6-fef1-4ba7-be0b-cfb3ade367a9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xq6bd" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536268 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d9a3bfc-818e-445a-a027-264cfcfade2b-bound-sa-token\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536283 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndr7d\" (UniqueName: \"kubernetes.io/projected/c565f4c6-fef1-4ba7-be0b-cfb3ade367a9-kube-api-access-ndr7d\") pod \"olm-operator-6b444d44fb-xq6bd\" (UID: \"c565f4c6-fef1-4ba7-be0b-cfb3ade367a9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xq6bd" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536324 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/6d9a3bfc-818e-445a-a027-264cfcfade2b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536409 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/147458bd-e3d6-4425-99ab-d2bd59fc2816-config\") pod \"kube-apiserver-operator-766d6c64bb-5kmjp\" (UID: \"147458bd-e3d6-4425-99ab-d2bd59fc2816\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5kmjp" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536442 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d9a3bfc-818e-445a-a027-264cfcfade2b-registry-tls\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536460 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a67d55ed-eb6f-4660-9013-de71da44aad7-config\") pod \"console-operator-58897d9998-mhj6z\" (UID: \"a67d55ed-eb6f-4660-9013-de71da44aad7\") " pod="openshift-console-operator/console-operator-58897d9998-mhj6z" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536493 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d9a3bfc-818e-445a-a027-264cfcfade2b-trusted-ca\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 
07:37:19.536519 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6d9a3bfc-818e-445a-a027-264cfcfade2b-registry-certificates\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536534 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4ffc\" (UniqueName: \"kubernetes.io/projected/ba82c167-7692-4fe7-843a-f9d17d328cfc-kube-api-access-p4ffc\") pod \"service-ca-9c57cc56f-4b6ss\" (UID: \"ba82c167-7692-4fe7-843a-f9d17d328cfc\") " pod="openshift-service-ca/service-ca-9c57cc56f-4b6ss" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536577 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ba82c167-7692-4fe7-843a-f9d17d328cfc-signing-key\") pod \"service-ca-9c57cc56f-4b6ss\" (UID: \"ba82c167-7692-4fe7-843a-f9d17d328cfc\") " pod="openshift-service-ca/service-ca-9c57cc56f-4b6ss" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536591 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ba82c167-7692-4fe7-843a-f9d17d328cfc-signing-cabundle\") pod \"service-ca-9c57cc56f-4b6ss\" (UID: \"ba82c167-7692-4fe7-843a-f9d17d328cfc\") " pod="openshift-service-ca/service-ca-9c57cc56f-4b6ss" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536605 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a0298f9-796d-4f69-bc48-f74d34944b99-config-volume\") pod \"dns-default-h9rz4\" (UID: \"7a0298f9-796d-4f69-bc48-f74d34944b99\") " 
pod="openshift-dns/dns-default-h9rz4" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536628 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrll8\" (UniqueName: \"kubernetes.io/projected/a67d55ed-eb6f-4660-9013-de71da44aad7-kube-api-access-nrll8\") pod \"console-operator-58897d9998-mhj6z\" (UID: \"a67d55ed-eb6f-4660-9013-de71da44aad7\") " pod="openshift-console-operator/console-operator-58897d9998-mhj6z" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536646 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljjhc\" (UniqueName: \"kubernetes.io/projected/7a0298f9-796d-4f69-bc48-f74d34944b99-kube-api-access-ljjhc\") pod \"dns-default-h9rz4\" (UID: \"7a0298f9-796d-4f69-bc48-f74d34944b99\") " pod="openshift-dns/dns-default-h9rz4" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536676 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a67d55ed-eb6f-4660-9013-de71da44aad7-trusted-ca\") pod \"console-operator-58897d9998-mhj6z\" (UID: \"a67d55ed-eb6f-4660-9013-de71da44aad7\") " pod="openshift-console-operator/console-operator-58897d9998-mhj6z" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536693 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ks8z9\" (UID: \"d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5\") " pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536753 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ks8z9\" (UID: \"d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5\") " pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536769 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/147458bd-e3d6-4425-99ab-d2bd59fc2816-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-5kmjp\" (UID: \"147458bd-e3d6-4425-99ab-d2bd59fc2816\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5kmjp" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536782 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9sst\" (UniqueName: \"kubernetes.io/projected/d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5-kube-api-access-d9sst\") pod \"marketplace-operator-79b997595-ks8z9\" (UID: \"d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5\") " pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536838 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a0298f9-796d-4f69-bc48-f74d34944b99-metrics-tls\") pod \"dns-default-h9rz4\" (UID: \"7a0298f9-796d-4f69-bc48-f74d34944b99\") " pod="openshift-dns/dns-default-h9rz4" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.536858 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92bvg\" (UniqueName: \"kubernetes.io/projected/6d9a3bfc-818e-445a-a027-264cfcfade2b-kube-api-access-92bvg\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc 
kubenswrapper[4599]: I1012 07:37:19.536889 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a67d55ed-eb6f-4660-9013-de71da44aad7-serving-cert\") pod \"console-operator-58897d9998-mhj6z\" (UID: \"a67d55ed-eb6f-4660-9013-de71da44aad7\") " pod="openshift-console-operator/console-operator-58897d9998-mhj6z" Oct 12 07:37:19 crc kubenswrapper[4599]: E1012 07:37:19.538987 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:20.038967633 +0000 UTC m=+136.828163135 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.572678 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r6fd4" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.577704 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-njb7r"] Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.595144 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jqfc" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.599768 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gjqh4" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.602612 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.629674 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44z27"] Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.631753 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-dn2jf"] Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.637737 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638016 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz9st\" (UniqueName: \"kubernetes.io/projected/08fb2a9f-0906-4c64-97f8-232b3bc1cdd7-kube-api-access-jz9st\") pod \"ingress-canary-26f4s\" (UID: \"08fb2a9f-0906-4c64-97f8-232b3bc1cdd7\") " pod="openshift-ingress-canary/ingress-canary-26f4s" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638042 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ba82c167-7692-4fe7-843a-f9d17d328cfc-signing-key\") pod \"service-ca-9c57cc56f-4b6ss\" (UID: \"ba82c167-7692-4fe7-843a-f9d17d328cfc\") " pod="openshift-service-ca/service-ca-9c57cc56f-4b6ss" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638059 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ba82c167-7692-4fe7-843a-f9d17d328cfc-signing-cabundle\") pod \"service-ca-9c57cc56f-4b6ss\" (UID: \"ba82c167-7692-4fe7-843a-f9d17d328cfc\") " pod="openshift-service-ca/service-ca-9c57cc56f-4b6ss" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638078 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a0298f9-796d-4f69-bc48-f74d34944b99-config-volume\") pod \"dns-default-h9rz4\" (UID: \"7a0298f9-796d-4f69-bc48-f74d34944b99\") " pod="openshift-dns/dns-default-h9rz4" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638110 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrll8\" (UniqueName: \"kubernetes.io/projected/a67d55ed-eb6f-4660-9013-de71da44aad7-kube-api-access-nrll8\") pod \"console-operator-58897d9998-mhj6z\" (UID: \"a67d55ed-eb6f-4660-9013-de71da44aad7\") " pod="openshift-console-operator/console-operator-58897d9998-mhj6z" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638138 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljjhc\" (UniqueName: \"kubernetes.io/projected/7a0298f9-796d-4f69-bc48-f74d34944b99-kube-api-access-ljjhc\") pod \"dns-default-h9rz4\" (UID: \"7a0298f9-796d-4f69-bc48-f74d34944b99\") " pod="openshift-dns/dns-default-h9rz4" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638179 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2ca6d2a5-c322-47e7-97c5-cbae83506fe9-mountpoint-dir\") pod \"csi-hostpathplugin-ljckr\" (UID: \"2ca6d2a5-c322-47e7-97c5-cbae83506fe9\") " pod="hostpath-provisioner/csi-hostpathplugin-ljckr" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638214 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a67d55ed-eb6f-4660-9013-de71da44aad7-trusted-ca\") pod \"console-operator-58897d9998-mhj6z\" (UID: \"a67d55ed-eb6f-4660-9013-de71da44aad7\") " pod="openshift-console-operator/console-operator-58897d9998-mhj6z" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638230 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ks8z9\" (UID: \"d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5\") " pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638247 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2ca6d2a5-c322-47e7-97c5-cbae83506fe9-csi-data-dir\") pod \"csi-hostpathplugin-ljckr\" (UID: \"2ca6d2a5-c322-47e7-97c5-cbae83506fe9\") " pod="hostpath-provisioner/csi-hostpathplugin-ljckr" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638280 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2ca6d2a5-c322-47e7-97c5-cbae83506fe9-registration-dir\") pod \"csi-hostpathplugin-ljckr\" (UID: \"2ca6d2a5-c322-47e7-97c5-cbae83506fe9\") " pod="hostpath-provisioner/csi-hostpathplugin-ljckr" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638365 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ks8z9\" (UID: \"d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638381 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/147458bd-e3d6-4425-99ab-d2bd59fc2816-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-5kmjp\" (UID: \"147458bd-e3d6-4425-99ab-d2bd59fc2816\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5kmjp" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638396 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9sst\" (UniqueName: \"kubernetes.io/projected/d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5-kube-api-access-d9sst\") pod \"marketplace-operator-79b997595-ks8z9\" (UID: \"d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5\") " pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638472 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a0298f9-796d-4f69-bc48-f74d34944b99-metrics-tls\") pod \"dns-default-h9rz4\" (UID: \"7a0298f9-796d-4f69-bc48-f74d34944b99\") " pod="openshift-dns/dns-default-h9rz4" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638497 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2ca6d2a5-c322-47e7-97c5-cbae83506fe9-plugins-dir\") pod \"csi-hostpathplugin-ljckr\" (UID: \"2ca6d2a5-c322-47e7-97c5-cbae83506fe9\") " pod="hostpath-provisioner/csi-hostpathplugin-ljckr" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638537 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92bvg\" (UniqueName: \"kubernetes.io/projected/6d9a3bfc-818e-445a-a027-264cfcfade2b-kube-api-access-92bvg\") pod 
\"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638559 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08fb2a9f-0906-4c64-97f8-232b3bc1cdd7-cert\") pod \"ingress-canary-26f4s\" (UID: \"08fb2a9f-0906-4c64-97f8-232b3bc1cdd7\") " pod="openshift-ingress-canary/ingress-canary-26f4s" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638585 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a67d55ed-eb6f-4660-9013-de71da44aad7-serving-cert\") pod \"console-operator-58897d9998-mhj6z\" (UID: \"a67d55ed-eb6f-4660-9013-de71da44aad7\") " pod="openshift-console-operator/console-operator-58897d9998-mhj6z" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638630 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/147458bd-e3d6-4425-99ab-d2bd59fc2816-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-5kmjp\" (UID: \"147458bd-e3d6-4425-99ab-d2bd59fc2816\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5kmjp" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638645 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c565f4c6-fef1-4ba7-be0b-cfb3ade367a9-srv-cert\") pod \"olm-operator-6b444d44fb-xq6bd\" (UID: \"c565f4c6-fef1-4ba7-be0b-cfb3ade367a9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xq6bd" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638704 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/6d9a3bfc-818e-445a-a027-264cfcfade2b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638729 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c565f4c6-fef1-4ba7-be0b-cfb3ade367a9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xq6bd\" (UID: \"c565f4c6-fef1-4ba7-be0b-cfb3ade367a9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xq6bd" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638744 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d9a3bfc-818e-445a-a027-264cfcfade2b-bound-sa-token\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638762 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndr7d\" (UniqueName: \"kubernetes.io/projected/c565f4c6-fef1-4ba7-be0b-cfb3ade367a9-kube-api-access-ndr7d\") pod \"olm-operator-6b444d44fb-xq6bd\" (UID: \"c565f4c6-fef1-4ba7-be0b-cfb3ade367a9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xq6bd" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638779 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6d9a3bfc-818e-445a-a027-264cfcfade2b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 
07:37:19.638804 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2ca6d2a5-c322-47e7-97c5-cbae83506fe9-socket-dir\") pod \"csi-hostpathplugin-ljckr\" (UID: \"2ca6d2a5-c322-47e7-97c5-cbae83506fe9\") " pod="hostpath-provisioner/csi-hostpathplugin-ljckr" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638818 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4jqz\" (UniqueName: \"kubernetes.io/projected/2ca6d2a5-c322-47e7-97c5-cbae83506fe9-kube-api-access-k4jqz\") pod \"csi-hostpathplugin-ljckr\" (UID: \"2ca6d2a5-c322-47e7-97c5-cbae83506fe9\") " pod="hostpath-provisioner/csi-hostpathplugin-ljckr" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638890 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/147458bd-e3d6-4425-99ab-d2bd59fc2816-config\") pod \"kube-apiserver-operator-766d6c64bb-5kmjp\" (UID: \"147458bd-e3d6-4425-99ab-d2bd59fc2816\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5kmjp" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638928 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d9a3bfc-818e-445a-a027-264cfcfade2b-registry-tls\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638943 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a67d55ed-eb6f-4660-9013-de71da44aad7-config\") pod \"console-operator-58897d9998-mhj6z\" (UID: \"a67d55ed-eb6f-4660-9013-de71da44aad7\") " 
pod="openshift-console-operator/console-operator-58897d9998-mhj6z" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.638970 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d9a3bfc-818e-445a-a027-264cfcfade2b-trusted-ca\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.646863 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6d9a3bfc-818e-445a-a027-264cfcfade2b-registry-certificates\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.646912 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4ffc\" (UniqueName: \"kubernetes.io/projected/ba82c167-7692-4fe7-843a-f9d17d328cfc-kube-api-access-p4ffc\") pod \"service-ca-9c57cc56f-4b6ss\" (UID: \"ba82c167-7692-4fe7-843a-f9d17d328cfc\") " pod="openshift-service-ca/service-ca-9c57cc56f-4b6ss" Oct 12 07:37:19 crc kubenswrapper[4599]: E1012 07:37:19.647883 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:20.147850993 +0000 UTC m=+136.937046495 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.648024 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6"] Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.649980 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvpk5"] Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.674992 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a0298f9-796d-4f69-bc48-f74d34944b99-config-volume\") pod \"dns-default-h9rz4\" (UID: \"7a0298f9-796d-4f69-bc48-f74d34944b99\") " pod="openshift-dns/dns-default-h9rz4" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.676324 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6d9a3bfc-818e-445a-a027-264cfcfade2b-registry-certificates\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.692046 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-kf2t6"] Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.697819 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/147458bd-e3d6-4425-99ab-d2bd59fc2816-config\") pod \"kube-apiserver-operator-766d6c64bb-5kmjp\" (UID: \"147458bd-e3d6-4425-99ab-d2bd59fc2816\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5kmjp" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.710719 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c565f4c6-fef1-4ba7-be0b-cfb3ade367a9-srv-cert\") pod \"olm-operator-6b444d44fb-xq6bd\" (UID: \"c565f4c6-fef1-4ba7-be0b-cfb3ade367a9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xq6bd" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.710780 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a67d55ed-eb6f-4660-9013-de71da44aad7-trusted-ca\") pod \"console-operator-58897d9998-mhj6z\" (UID: \"a67d55ed-eb6f-4660-9013-de71da44aad7\") " pod="openshift-console-operator/console-operator-58897d9998-mhj6z" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.711983 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c565f4c6-fef1-4ba7-be0b-cfb3ade367a9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xq6bd\" (UID: \"c565f4c6-fef1-4ba7-be0b-cfb3ade367a9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xq6bd" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.714969 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ks8z9\" (UID: \"d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5\") " pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.717101 4599 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a67d55ed-eb6f-4660-9013-de71da44aad7-serving-cert\") pod \"console-operator-58897d9998-mhj6z\" (UID: \"a67d55ed-eb6f-4660-9013-de71da44aad7\") " pod="openshift-console-operator/console-operator-58897d9998-mhj6z" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.718124 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a0298f9-796d-4f69-bc48-f74d34944b99-metrics-tls\") pod \"dns-default-h9rz4\" (UID: \"7a0298f9-796d-4f69-bc48-f74d34944b99\") " pod="openshift-dns/dns-default-h9rz4" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.718576 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ba82c167-7692-4fe7-843a-f9d17d328cfc-signing-cabundle\") pod \"service-ca-9c57cc56f-4b6ss\" (UID: \"ba82c167-7692-4fe7-843a-f9d17d328cfc\") " pod="openshift-service-ca/service-ca-9c57cc56f-4b6ss" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.719141 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ba82c167-7692-4fe7-843a-f9d17d328cfc-signing-key\") pod \"service-ca-9c57cc56f-4b6ss\" (UID: \"ba82c167-7692-4fe7-843a-f9d17d328cfc\") " pod="openshift-service-ca/service-ca-9c57cc56f-4b6ss" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.726154 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4ffc\" (UniqueName: \"kubernetes.io/projected/ba82c167-7692-4fe7-843a-f9d17d328cfc-kube-api-access-p4ffc\") pod \"service-ca-9c57cc56f-4b6ss\" (UID: \"ba82c167-7692-4fe7-843a-f9d17d328cfc\") " pod="openshift-service-ca/service-ca-9c57cc56f-4b6ss" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.728428 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/6d9a3bfc-818e-445a-a027-264cfcfade2b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.729527 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d9a3bfc-818e-445a-a027-264cfcfade2b-trusted-ca\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.731633 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6d9a3bfc-818e-445a-a027-264cfcfade2b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.742468 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d9a3bfc-818e-445a-a027-264cfcfade2b-registry-tls\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.742809 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/147458bd-e3d6-4425-99ab-d2bd59fc2816-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-5kmjp\" (UID: \"147458bd-e3d6-4425-99ab-d2bd59fc2816\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5kmjp" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.743782 4599 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d9a3bfc-818e-445a-a027-264cfcfade2b-bound-sa-token\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.750992 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ks8z9\" (UID: \"d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5\") " pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.751191 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz9st\" (UniqueName: \"kubernetes.io/projected/08fb2a9f-0906-4c64-97f8-232b3bc1cdd7-kube-api-access-jz9st\") pod \"ingress-canary-26f4s\" (UID: \"08fb2a9f-0906-4c64-97f8-232b3bc1cdd7\") " pod="openshift-ingress-canary/ingress-canary-26f4s" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.751205 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a67d55ed-eb6f-4660-9013-de71da44aad7-config\") pod \"console-operator-58897d9998-mhj6z\" (UID: \"a67d55ed-eb6f-4660-9013-de71da44aad7\") " pod="openshift-console-operator/console-operator-58897d9998-mhj6z" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.751381 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2ca6d2a5-c322-47e7-97c5-cbae83506fe9-mountpoint-dir\") pod \"csi-hostpathplugin-ljckr\" (UID: \"2ca6d2a5-c322-47e7-97c5-cbae83506fe9\") " pod="hostpath-provisioner/csi-hostpathplugin-ljckr" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.751413 4599 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2ca6d2a5-c322-47e7-97c5-cbae83506fe9-csi-data-dir\") pod \"csi-hostpathplugin-ljckr\" (UID: \"2ca6d2a5-c322-47e7-97c5-cbae83506fe9\") " pod="hostpath-provisioner/csi-hostpathplugin-ljckr" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.751433 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2ca6d2a5-c322-47e7-97c5-cbae83506fe9-registration-dir\") pod \"csi-hostpathplugin-ljckr\" (UID: \"2ca6d2a5-c322-47e7-97c5-cbae83506fe9\") " pod="hostpath-provisioner/csi-hostpathplugin-ljckr" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.751493 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2ca6d2a5-c322-47e7-97c5-cbae83506fe9-plugins-dir\") pod \"csi-hostpathplugin-ljckr\" (UID: \"2ca6d2a5-c322-47e7-97c5-cbae83506fe9\") " pod="hostpath-provisioner/csi-hostpathplugin-ljckr" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.752279 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08fb2a9f-0906-4c64-97f8-232b3bc1cdd7-cert\") pod \"ingress-canary-26f4s\" (UID: \"08fb2a9f-0906-4c64-97f8-232b3bc1cdd7\") " pod="openshift-ingress-canary/ingress-canary-26f4s" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.752368 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.752412 4599 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2ca6d2a5-c322-47e7-97c5-cbae83506fe9-socket-dir\") pod \"csi-hostpathplugin-ljckr\" (UID: \"2ca6d2a5-c322-47e7-97c5-cbae83506fe9\") " pod="hostpath-provisioner/csi-hostpathplugin-ljckr" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.752429 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4jqz\" (UniqueName: \"kubernetes.io/projected/2ca6d2a5-c322-47e7-97c5-cbae83506fe9-kube-api-access-k4jqz\") pod \"csi-hostpathplugin-ljckr\" (UID: \"2ca6d2a5-c322-47e7-97c5-cbae83506fe9\") " pod="hostpath-provisioner/csi-hostpathplugin-ljckr" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.752629 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2ca6d2a5-c322-47e7-97c5-cbae83506fe9-plugins-dir\") pod \"csi-hostpathplugin-ljckr\" (UID: \"2ca6d2a5-c322-47e7-97c5-cbae83506fe9\") " pod="hostpath-provisioner/csi-hostpathplugin-ljckr" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.751570 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2ca6d2a5-c322-47e7-97c5-cbae83506fe9-csi-data-dir\") pod \"csi-hostpathplugin-ljckr\" (UID: \"2ca6d2a5-c322-47e7-97c5-cbae83506fe9\") " pod="hostpath-provisioner/csi-hostpathplugin-ljckr" Oct 12 07:37:19 crc kubenswrapper[4599]: E1012 07:37:19.758718 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:20.258686846 +0000 UTC m=+137.047882349 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.751606 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2ca6d2a5-c322-47e7-97c5-cbae83506fe9-mountpoint-dir\") pod \"csi-hostpathplugin-ljckr\" (UID: \"2ca6d2a5-c322-47e7-97c5-cbae83506fe9\") " pod="hostpath-provisioner/csi-hostpathplugin-ljckr" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.751752 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2ca6d2a5-c322-47e7-97c5-cbae83506fe9-registration-dir\") pod \"csi-hostpathplugin-ljckr\" (UID: \"2ca6d2a5-c322-47e7-97c5-cbae83506fe9\") " pod="hostpath-provisioner/csi-hostpathplugin-ljckr" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.758830 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2ca6d2a5-c322-47e7-97c5-cbae83506fe9-socket-dir\") pod \"csi-hostpathplugin-ljckr\" (UID: \"2ca6d2a5-c322-47e7-97c5-cbae83506fe9\") " pod="hostpath-provisioner/csi-hostpathplugin-ljckr" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.769488 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92bvg\" (UniqueName: \"kubernetes.io/projected/6d9a3bfc-818e-445a-a027-264cfcfade2b-kube-api-access-92bvg\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.771563 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08fb2a9f-0906-4c64-97f8-232b3bc1cdd7-cert\") pod \"ingress-canary-26f4s\" (UID: \"08fb2a9f-0906-4c64-97f8-232b3bc1cdd7\") " pod="openshift-ingress-canary/ingress-canary-26f4s" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.788790 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5mhff"] Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.788833 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r"] Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.794832 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/147458bd-e3d6-4425-99ab-d2bd59fc2816-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-5kmjp\" (UID: \"147458bd-e3d6-4425-99ab-d2bd59fc2816\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5kmjp" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.824558 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndr7d\" (UniqueName: \"kubernetes.io/projected/c565f4c6-fef1-4ba7-be0b-cfb3ade367a9-kube-api-access-ndr7d\") pod \"olm-operator-6b444d44fb-xq6bd\" (UID: \"c565f4c6-fef1-4ba7-be0b-cfb3ade367a9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xq6bd" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.829184 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljjhc\" (UniqueName: \"kubernetes.io/projected/7a0298f9-796d-4f69-bc48-f74d34944b99-kube-api-access-ljjhc\") pod \"dns-default-h9rz4\" (UID: 
\"7a0298f9-796d-4f69-bc48-f74d34944b99\") " pod="openshift-dns/dns-default-h9rz4" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.844942 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrll8\" (UniqueName: \"kubernetes.io/projected/a67d55ed-eb6f-4660-9013-de71da44aad7-kube-api-access-nrll8\") pod \"console-operator-58897d9998-mhj6z\" (UID: \"a67d55ed-eb6f-4660-9013-de71da44aad7\") " pod="openshift-console-operator/console-operator-58897d9998-mhj6z" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.855076 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:19 crc kubenswrapper[4599]: E1012 07:37:19.855586 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:20.355569219 +0000 UTC m=+137.144764721 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.860342 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn"] Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.864237 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-mhj6z" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.867168 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9sst\" (UniqueName: \"kubernetes.io/projected/d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5-kube-api-access-d9sst\") pod \"marketplace-operator-79b997595-ks8z9\" (UID: \"d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5\") " pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.874184 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5kmjp" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.877887 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz9st\" (UniqueName: \"kubernetes.io/projected/08fb2a9f-0906-4c64-97f8-232b3bc1cdd7-kube-api-access-jz9st\") pod \"ingress-canary-26f4s\" (UID: \"08fb2a9f-0906-4c64-97f8-232b3bc1cdd7\") " pod="openshift-ingress-canary/ingress-canary-26f4s" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.914247 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.926270 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4jqz\" (UniqueName: \"kubernetes.io/projected/2ca6d2a5-c322-47e7-97c5-cbae83506fe9-kube-api-access-k4jqz\") pod \"csi-hostpathplugin-ljckr\" (UID: \"2ca6d2a5-c322-47e7-97c5-cbae83506fe9\") " pod="hostpath-provisioner/csi-hostpathplugin-ljckr" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.931263 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4b6ss" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.956726 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-h9rz4" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.957043 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xq6bd" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.957390 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:19 crc kubenswrapper[4599]: E1012 07:37:19.957845 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:20.457817658 +0000 UTC m=+137.247013151 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.981527 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ljckr" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.983264 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-26f4s" Oct 12 07:37:19 crc kubenswrapper[4599]: I1012 07:37:19.996221 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5t7"] Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.006314 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lwg5q"] Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.062679 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:20 crc kubenswrapper[4599]: E1012 07:37:20.063233 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:20.563214038 +0000 UTC m=+137.352409539 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.169125 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:20 crc kubenswrapper[4599]: E1012 07:37:20.169462 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:20.669447727 +0000 UTC m=+137.458643229 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.211852 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5t7" event={"ID":"0c13d504-e266-472f-9ca7-6644f8b37eef","Type":"ContainerStarted","Data":"80faabac1050d51cda45fc36d4c80525e660fac1e0a0fe209572098385f3e7d2"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.220909 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj" event={"ID":"09a80831-32a7-4583-909d-96fa119c7aa1","Type":"ContainerStarted","Data":"1ea90cd60dc1918cf72691bce5e6ad5edf920f0c359df37d8d606206287ea24a"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.225410 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-psd6r" event={"ID":"4459d391-5fe4-4159-b385-9c35fe4cfe77","Type":"ContainerStarted","Data":"37cdad249d32741ac9c9d735e30a4e58dee84988602522c9d214a070a061acf4"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.242143 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-kf2t6" event={"ID":"b84f6b8e-f98e-4d84-823f-f83d7912bc6b","Type":"ContainerStarted","Data":"8c1549808a3aeeafcb61dccfef0b38b7739ed8da240570037df080bb04723374"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.251190 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9" 
event={"ID":"fd61fed5-8370-4d59-8604-be1c73527f77","Type":"ContainerStarted","Data":"e72a58715a2d4c4e822934d1c649443ddd015c4e03b9795e8b08432af8e4de4d"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.251239 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9" event={"ID":"fd61fed5-8370-4d59-8604-be1c73527f77","Type":"ContainerStarted","Data":"2de56b0c2dd9443dad9437956eae97b31af32eb690e5b2a9f0fc1a1cd362594c"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.253258 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9" Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.257216 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5mhff" event={"ID":"003c1afa-e2c6-45c8-bdb3-1da462a231d3","Type":"ContainerStarted","Data":"99137af516c7a5e8daabcc34706f976f7833a9cce06e8aeb94710648a7a35a86"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.259507 4599 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-h94q9 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:5443/healthz\": dial tcp 10.217.0.23:5443: connect: connection refused" start-of-body= Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.259554 4599 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9" podUID="fd61fed5-8370-4d59-8604-be1c73527f77" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.23:5443/healthz\": dial tcp 10.217.0.23:5443: connect: connection refused" Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.260976 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-42f4k" event={"ID":"1ce36f77-945f-473d-8fa5-011ee88d9adb","Type":"ContainerStarted","Data":"8fd277d6de1de2f9ee9eb0a18fb48ec7ec303ad8fbee5e1e59f0ce98960b33b0"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.264077 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn" event={"ID":"11c1c8aa-1469-4229-9163-9df08ae4192f","Type":"ContainerStarted","Data":"8e75c96ea24747bb2dd06e10c4e43609c40301b920fdc80f5ca78692ef59f365"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.265263 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tgck5" event={"ID":"1e511cbc-ca72-4380-8626-d9cade8ce3e2","Type":"ContainerStarted","Data":"3f27edadb9b0ed6ae960ede595e1517cd43ddd6c864266022f3a228944790e2f"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.265295 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tgck5" event={"ID":"1e511cbc-ca72-4380-8626-d9cade8ce3e2","Type":"ContainerStarted","Data":"7b17ef9d68442350c6531845c7ba7eedec135a3c8bf789a7373f4f3414b80e9a"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.270106 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:20 crc kubenswrapper[4599]: E1012 07:37:20.270499 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-12 07:37:20.77047926 +0000 UTC m=+137.559674762 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.276542 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-26bbs" event={"ID":"3422bfa7-f155-488a-a72c-129bed440646","Type":"ContainerStarted","Data":"84849e63e8566aa0822065cd50b9e4e67a58bac140b8a1e2495bc18f86fba42a"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.282710 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvpk5" event={"ID":"a823a439-112f-4160-b925-b9b0830ec38f","Type":"ContainerStarted","Data":"f1f814abb16ca7c6e5f444459c8d8a7fc2d3ceddcc427acdbb8993a7eebed443"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.285815 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-dn2jf" event={"ID":"2bd8b379-5b19-453f-af09-9025f6b127db","Type":"ContainerStarted","Data":"f044d790d5a9d58f6e9fe9d73842c52d253b8544dac50e74d5add57f5ba064c2"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.294938 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ggnjr" event={"ID":"bab40ec2-3024-45f4-9efb-f07829d8786b","Type":"ContainerStarted","Data":"ecf2ab1958e9de103735d55d95cc10d334bcdd34d4245d1669f2787d6758b13d"} Oct 12 07:37:20 
crc kubenswrapper[4599]: I1012 07:37:20.294967 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ggnjr" event={"ID":"bab40ec2-3024-45f4-9efb-f07829d8786b","Type":"ContainerStarted","Data":"d1918e59bc34aee86d06927258fefecc5cf47bf18485d36b00d2227dcf8eb565"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.313756 4599 generic.go:334] "Generic (PLEG): container finished" podID="be2a1543-ec30-4cc1-923f-62123740ac1a" containerID="69757095c23b00b6b9edee86733e9e21cc26de11c3a0de114aa09bdd5ae5c41e" exitCode=0 Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.314474 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gj5kj" event={"ID":"be2a1543-ec30-4cc1-923f-62123740ac1a","Type":"ContainerDied","Data":"69757095c23b00b6b9edee86733e9e21cc26de11c3a0de114aa09bdd5ae5c41e"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.314510 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gj5kj" event={"ID":"be2a1543-ec30-4cc1-923f-62123740ac1a","Type":"ContainerStarted","Data":"6ecaead645bf82b00a678feec44001ebcf26f9473880b175a1270b0c5c8e9f78"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.371425 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" event={"ID":"ce6df133-1d7c-41a7-a9a8-a18676bae045","Type":"ContainerStarted","Data":"170f7b6737777ee8004fac1f86fc1e04b9b57a27e9148981f87f9ca0c1ab1251"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.371994 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0409709f-7b11-4bcd-aec9-b8922c4474c9-config\") pod \"machine-approver-56656f9798-f8wgl\" (UID: \"0409709f-7b11-4bcd-aec9-b8922c4474c9\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.372137 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0409709f-7b11-4bcd-aec9-b8922c4474c9-auth-proxy-config\") pod \"machine-approver-56656f9798-f8wgl\" (UID: \"0409709f-7b11-4bcd-aec9-b8922c4474c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.372196 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.372265 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0409709f-7b11-4bcd-aec9-b8922c4474c9-machine-approver-tls\") pod \"machine-approver-56656f9798-f8wgl\" (UID: \"0409709f-7b11-4bcd-aec9-b8922c4474c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.372944 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0409709f-7b11-4bcd-aec9-b8922c4474c9-auth-proxy-config\") pod \"machine-approver-56656f9798-f8wgl\" (UID: \"0409709f-7b11-4bcd-aec9-b8922c4474c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" Oct 12 07:37:20 crc kubenswrapper[4599]: E1012 07:37:20.373430 4599 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:20.87341736 +0000 UTC m=+137.662612862 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.373430 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0409709f-7b11-4bcd-aec9-b8922c4474c9-config\") pod \"machine-approver-56656f9798-f8wgl\" (UID: \"0409709f-7b11-4bcd-aec9-b8922c4474c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.405362 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r" event={"ID":"a560875d-c07e-457b-a77b-809cc770867c","Type":"ContainerStarted","Data":"32231bc6559ab9349eb4f37f7397496baa8825e0857c02803e02e1b4f4d429f1"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.419917 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0409709f-7b11-4bcd-aec9-b8922c4474c9-machine-approver-tls\") pod \"machine-approver-56656f9798-f8wgl\" (UID: \"0409709f-7b11-4bcd-aec9-b8922c4474c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.427377 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44z27" event={"ID":"b22d491a-9add-4ec5-ad3e-e593f9ca93bd","Type":"ContainerStarted","Data":"184f91d73f84ed151e52247023f2db97491b6e3af657644e151a8fc084caf4b2"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.474152 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:20 crc kubenswrapper[4599]: E1012 07:37:20.474967 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:20.974950108 +0000 UTC m=+137.764145610 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.488321 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njb7r" event={"ID":"3cdcf878-abd9-481a-9f95-adeda51c77c8","Type":"ContainerStarted","Data":"6a724e601e48403fd3c069830636f63f7fb5a742b2a6382f19b239921975752e"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.546649 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.559137 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4jqfc"] Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.565817 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z"] Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.580168 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:20 crc kubenswrapper[4599]: E1012 07:37:20.580463 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:21.080448059 +0000 UTC m=+137.869643562 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.596150 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" event={"ID":"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1","Type":"ContainerStarted","Data":"8bbdada1748118d6e5a8ff116d1f17170316717597f8675d98a27970170f45c7"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.596926 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.599237 4599 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-jlnn9 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused" start-of-body= Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.599284 4599 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" podUID="06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused" Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.618388 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gjqh4"] Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.632043 4599 generic.go:334] 
"Generic (PLEG): container finished" podID="10b365db-4478-4250-abe6-fa9e77354d70" containerID="68051a0dd7fcc570567bc9328a6f421c068fd30866f69722887265ace6a4917a" exitCode=0 Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.632114 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" event={"ID":"10b365db-4478-4250-abe6-fa9e77354d70","Type":"ContainerDied","Data":"68051a0dd7fcc570567bc9328a6f421c068fd30866f69722887265ace6a4917a"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.632140 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" event={"ID":"10b365db-4478-4250-abe6-fa9e77354d70","Type":"ContainerStarted","Data":"998542afb78b84d053410eb7ddb92f89f3c52175c772380544a0cff8acad2305"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.645290 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r6fd4"] Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.654065 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-k258m" event={"ID":"b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5","Type":"ContainerStarted","Data":"b08988c0e9d83c57c344b18865fbb0bf396f9b7b5925a56c5a2a2fc38cd36e8b"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.654104 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-k258m" event={"ID":"b5519e59-bf53-4ae8-bcb0-4f7ff5b050e5","Type":"ContainerStarted","Data":"4eb1c91d0b41b86e0e250a18bd325f781c55be67e0ac74c51cbb9a96a556f93f"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.669516 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6" 
event={"ID":"9ffdc69e-7658-4b27-949a-594688dbee92","Type":"ContainerStarted","Data":"f32380372a519bfb2a487fc8bc3c99c833e18f0141af819a9531c298b6198f0d"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.681876 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:20 crc kubenswrapper[4599]: E1012 07:37:20.682704 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:21.182684435 +0000 UTC m=+137.971879938 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.695488 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r64z4" event={"ID":"1712254f-c8df-4c98-bfb2-bf79d98a6161","Type":"ContainerStarted","Data":"d473a19b167ee39a980012d79da4455e7a7558d10ba5565a147b66d32f6a1275"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.695549 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r64z4" 
event={"ID":"1712254f-c8df-4c98-bfb2-bf79d98a6161","Type":"ContainerStarted","Data":"e2d9997a3cafcec5965ce9598ff4163f41ad5d800f4db41f7085b77b03c27a73"} Oct 12 07:37:20 crc kubenswrapper[4599]: W1012 07:37:20.726726 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0409709f_7b11_4bcd_aec9_b8922c4474c9.slice/crio-c91e31d3be2c8ed3df86610b0f72e010694a08c77fd8f0fcc0db5d8d849c211a WatchSource:0}: Error finding container c91e31d3be2c8ed3df86610b0f72e010694a08c77fd8f0fcc0db5d8d849c211a: Status 404 returned error can't find the container with id c91e31d3be2c8ed3df86610b0f72e010694a08c77fd8f0fcc0db5d8d849c211a Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.759666 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-nlffm" event={"ID":"da19aa12-0eea-49f0-820a-b2d6da99484a","Type":"ContainerStarted","Data":"73dedb66018bc404159da2878a084db3405ba297152ab50e418edc02be9fcf94"} Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.759760 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.770410 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.773471 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-26f4s"] Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.784210 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: 
\"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:20 crc kubenswrapper[4599]: E1012 07:37:20.786221 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:21.286202682 +0000 UTC m=+138.075398183 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.872178 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.895250 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:20 crc kubenswrapper[4599]: E1012 07:37:20.895674 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:21.395646368 +0000 UTC m=+138.184841870 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.895910 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:20 crc kubenswrapper[4599]: E1012 07:37:20.905538 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:21.405510602 +0000 UTC m=+138.194706104 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:20 crc kubenswrapper[4599]: I1012 07:37:20.982361 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ggnjr" podStartSLOduration=119.982344443 podStartE2EDuration="1m59.982344443s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:20.980755074 +0000 UTC m=+137.769950576" watchObservedRunningTime="2025-10-12 07:37:20.982344443 +0000 UTC m=+137.771539945" Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.001940 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:21 crc kubenswrapper[4599]: E1012 07:37:21.002582 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:21.502566322 +0000 UTC m=+138.291761824 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.018692 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-r64z4" podStartSLOduration=120.018671794 podStartE2EDuration="2m0.018671794s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:21.018135682 +0000 UTC m=+137.807331185" watchObservedRunningTime="2025-10-12 07:37:21.018671794 +0000 UTC m=+137.807867296" Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.103140 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:21 crc kubenswrapper[4599]: E1012 07:37:21.103503 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:21.603490262 +0000 UTC m=+138.392685764 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.125861 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-nlffm" podStartSLOduration=121.125839366 podStartE2EDuration="2m1.125839366s" podCreationTimestamp="2025-10-12 07:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:21.084947551 +0000 UTC m=+137.874143053" watchObservedRunningTime="2025-10-12 07:37:21.125839366 +0000 UTC m=+137.915034869" Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.134586 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-r64z4" Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.205520 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:21 crc kubenswrapper[4599]: E1012 07:37:21.206174 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-12 07:37:21.706155828 +0000 UTC m=+138.495351330 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.230883 4599 patch_prober.go:28] interesting pod/router-default-5444994796-r64z4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 07:37:21 crc kubenswrapper[4599]: [-]has-synced failed: reason withheld Oct 12 07:37:21 crc kubenswrapper[4599]: [+]process-running ok Oct 12 07:37:21 crc kubenswrapper[4599]: healthz check failed Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.230930 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r64z4" podUID="1712254f-c8df-4c98-bfb2-bf79d98a6161" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.263817 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-tgck5" podStartSLOduration=120.263795508 podStartE2EDuration="2m0.263795508s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:21.218669773 +0000 UTC m=+138.007865274" watchObservedRunningTime="2025-10-12 07:37:21.263795508 +0000 UTC m=+138.052991010" Oct 12 
07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.318714 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:21 crc kubenswrapper[4599]: E1012 07:37:21.319057 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:21.819042513 +0000 UTC m=+138.608238015 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.376025 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngfnl" podStartSLOduration=121.376002639 podStartE2EDuration="2m1.376002639s" podCreationTimestamp="2025-10-12 07:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:21.374632434 +0000 UTC m=+138.163827936" watchObservedRunningTime="2025-10-12 07:37:21.376002639 +0000 UTC m=+138.165198142" Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.419854 4599 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:21 crc kubenswrapper[4599]: E1012 07:37:21.420432 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:21.920414667 +0000 UTC m=+138.709610169 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.522095 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:21 crc kubenswrapper[4599]: E1012 07:37:21.522400 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:22.022387166 +0000 UTC m=+138.811582668 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.534248 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9" podStartSLOduration=120.534231847 podStartE2EDuration="2m0.534231847s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:21.528743721 +0000 UTC m=+138.317939223" watchObservedRunningTime="2025-10-12 07:37:21.534231847 +0000 UTC m=+138.323427349" Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.623344 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:21 crc kubenswrapper[4599]: E1012 07:37:21.623929 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:22.123909224 +0000 UTC m=+138.913104726 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.707225 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-k258m" podStartSLOduration=5.707195039 podStartE2EDuration="5.707195039s" podCreationTimestamp="2025-10-12 07:37:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:21.57714113 +0000 UTC m=+138.366336632" watchObservedRunningTime="2025-10-12 07:37:21.707195039 +0000 UTC m=+138.496390541" Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.709504 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h9rz4"] Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.711022 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5kmjp"] Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.711244 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" podStartSLOduration=120.711237568 podStartE2EDuration="2m0.711237568s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:21.647946103 +0000 UTC m=+138.437141605" watchObservedRunningTime="2025-10-12 07:37:21.711237568 +0000 UTC 
m=+138.500433071" Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.724841 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4b6ss"] Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.734693 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:21 crc kubenswrapper[4599]: E1012 07:37:21.735151 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:22.235136297 +0000 UTC m=+139.024331798 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.749925 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" podStartSLOduration=120.749905196 podStartE2EDuration="2m0.749905196s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:21.749757076 +0000 UTC m=+138.538952578" watchObservedRunningTime="2025-10-12 07:37:21.749905196 +0000 UTC m=+138.539100698" Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.777413 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ks8z9"] Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.777721 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xq6bd"] Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.840615 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:21 crc kubenswrapper[4599]: E1012 07:37:21.841207 4599 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:22.341190377 +0000 UTC m=+139.130385879 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.849360 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" podStartSLOduration=121.849323344 podStartE2EDuration="2m1.849323344s" podCreationTimestamp="2025-10-12 07:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:21.848215765 +0000 UTC m=+138.637411266" watchObservedRunningTime="2025-10-12 07:37:21.849323344 +0000 UTC m=+138.638518846" Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.859308 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ljckr"] Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.883586 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gj5kj" event={"ID":"be2a1543-ec30-4cc1-923f-62123740ac1a","Type":"ContainerStarted","Data":"e9528fd28c8931e202dbbe8f1d9530a9d102b186471d229414e215a25aba2372"} Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.884518 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-gj5kj" Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.917314 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj" event={"ID":"09a80831-32a7-4583-909d-96fa119c7aa1","Type":"ContainerStarted","Data":"1d1f35fbb596bff1be49ad78a5c0666bd45622f729e6d69f5f19db59fac9e061"} Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.920217 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44z27" event={"ID":"b22d491a-9add-4ec5-ad3e-e593f9ca93bd","Type":"ContainerStarted","Data":"6b88f6037618c6c5fb1cface1d0cd2c6365ce36b0edfe21457aa6a92fe269029"} Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.941901 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:21 crc kubenswrapper[4599]: E1012 07:37:21.942207 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:22.442194728 +0000 UTC m=+139.231390221 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.943647 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" event={"ID":"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1","Type":"ContainerStarted","Data":"17b1c6b7ed4f23206782035e25723153ac8f1f4bdd6b7d5d18ecd73d10dbbb65"} Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.986199 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-psd6r" event={"ID":"4459d391-5fe4-4159-b385-9c35fe4cfe77","Type":"ContainerStarted","Data":"b310aeec835de5f90294d9d5d24a67dd0ecc465f62ed5961e99b7f477414c9fc"} Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.986512 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-psd6r" event={"ID":"4459d391-5fe4-4159-b385-9c35fe4cfe77","Type":"ContainerStarted","Data":"c4d611858b2b4192db103e9caf37895148c21cccf769f28118f32bdde9250dcd"} Oct 12 07:37:21 crc kubenswrapper[4599]: I1012 07:37:21.988958 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gj5kj" podStartSLOduration=121.98894542 podStartE2EDuration="2m1.98894542s" podCreationTimestamp="2025-10-12 07:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:21.987628174 +0000 UTC 
m=+138.776823676" watchObservedRunningTime="2025-10-12 07:37:21.98894542 +0000 UTC m=+138.778140922" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.010970 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mhj6z"] Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.015418 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" event={"ID":"0409709f-7b11-4bcd-aec9-b8922c4474c9","Type":"ContainerStarted","Data":"c91e31d3be2c8ed3df86610b0f72e010694a08c77fd8f0fcc0db5d8d849c211a"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.023749 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-44z27" podStartSLOduration=121.023736823 podStartE2EDuration="2m1.023736823s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:22.022676993 +0000 UTC m=+138.811872495" watchObservedRunningTime="2025-10-12 07:37:22.023736823 +0000 UTC m=+138.812932325" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.054059 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:22 crc kubenswrapper[4599]: E1012 07:37:22.055377 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-12 07:37:22.555359498 +0000 UTC m=+139.344554999 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.059537 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-kf2t6" event={"ID":"b84f6b8e-f98e-4d84-823f-f83d7912bc6b","Type":"ContainerStarted","Data":"82ed71ef5e676a0bb54ceb0d98a08c7e257abdf0f2dc28983080b3c8cde201a3"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.061309 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-kf2t6" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.061863 4599 patch_prober.go:28] interesting pod/downloads-7954f5f757-kf2t6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.061918 4599 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kf2t6" podUID="b84f6b8e-f98e-4d84-823f-f83d7912bc6b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.068684 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h9rz4" 
event={"ID":"7a0298f9-796d-4f69-bc48-f74d34944b99","Type":"ContainerStarted","Data":"3a6931e0edc858bc9f0cabe98038ef2422b4049c89b8ea8a687ba142cf3ce28e"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.070386 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-42f4k" event={"ID":"1ce36f77-945f-473d-8fa5-011ee88d9adb","Type":"ContainerStarted","Data":"5bc0919c7bcfa8d953a607a51081520b8625b7c7322a5068146f74fdf1806c67"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.082422 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-psd6r" podStartSLOduration=122.082406326 podStartE2EDuration="2m2.082406326s" podCreationTimestamp="2025-10-12 07:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:22.077699383 +0000 UTC m=+138.866894886" watchObservedRunningTime="2025-10-12 07:37:22.082406326 +0000 UTC m=+138.871601828" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.131981 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-8nzkj" podStartSLOduration=121.131960908 podStartE2EDuration="2m1.131960908s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:22.111760919 +0000 UTC m=+138.900956421" watchObservedRunningTime="2025-10-12 07:37:22.131960908 +0000 UTC m=+138.921156410" Oct 12 07:37:22 crc kubenswrapper[4599]: W1012 07:37:22.132629 4599 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda67d55ed_eb6f_4660_9013_de71da44aad7.slice/crio-24d88e10faa18c3c5d807b05f2068b1c661cc5e6d6b9385cb0c57e7c6d49c0d7 WatchSource:0}: Error finding container 24d88e10faa18c3c5d807b05f2068b1c661cc5e6d6b9385cb0c57e7c6d49c0d7: Status 404 returned error can't find the container with id 24d88e10faa18c3c5d807b05f2068b1c661cc5e6d6b9385cb0c57e7c6d49c0d7 Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.144560 4599 patch_prober.go:28] interesting pod/router-default-5444994796-r64z4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 07:37:22 crc kubenswrapper[4599]: [-]has-synced failed: reason withheld Oct 12 07:37:22 crc kubenswrapper[4599]: [+]process-running ok Oct 12 07:37:22 crc kubenswrapper[4599]: healthz check failed Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.144625 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r64z4" podUID="1712254f-c8df-4c98-bfb2-bf79d98a6161" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.148618 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jqfc" event={"ID":"503dce4f-a5a4-4a20-8546-a0311a212b84","Type":"ContainerStarted","Data":"f9720e6ec5f1aa5414a6106e53ba9497414f142805efc995b21bb4d68f93dabc"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.148661 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jqfc" event={"ID":"503dce4f-a5a4-4a20-8546-a0311a212b84","Type":"ContainerStarted","Data":"12cbeb2119b32e38eb1d0256b47be744f33edb3367613d9d1298d7ad0c725d91"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.156287 4599 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:22 crc kubenswrapper[4599]: E1012 07:37:22.157630 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:22.657611703 +0000 UTC m=+139.446807206 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.174997 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-42f4k" podStartSLOduration=121.174974618 podStartE2EDuration="2m1.174974618s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:22.174761666 +0000 UTC m=+138.963957168" watchObservedRunningTime="2025-10-12 07:37:22.174974618 +0000 UTC m=+138.964170120" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.219748 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lwg5q" 
event={"ID":"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7","Type":"ContainerStarted","Data":"0edded123a6780987964859623160fc4f4e89c2d9bb63fe0b8cf6d07b7742584"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.220038 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lwg5q" event={"ID":"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7","Type":"ContainerStarted","Data":"7132b8e06fe2a89d206adaabce104a6b553eaf5058c69e921243ce0afd346fbf"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.235703 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.237477 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gjqh4" event={"ID":"c2d2a6b0-e99f-4c3b-8535-77d97c89d3ed","Type":"ContainerStarted","Data":"7e53327f73e4fd64ccc33286c32b1c0e4da40e7a75a5141dd569ca567443317a"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.242581 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-kf2t6" podStartSLOduration=122.242569313 podStartE2EDuration="2m2.242569313s" podCreationTimestamp="2025-10-12 07:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:22.241150146 +0000 UTC m=+139.030345648" watchObservedRunningTime="2025-10-12 07:37:22.242569313 +0000 UTC m=+139.031764816" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.260700 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:22 
crc kubenswrapper[4599]: E1012 07:37:22.261795 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:22.761773182 +0000 UTC m=+139.550968685 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.280784 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r" event={"ID":"a560875d-c07e-457b-a77b-809cc770867c","Type":"ContainerStarted","Data":"09d77bf8350365feb41b8b233fae9fd3ad7ad85f82c388edfbf15103162997eb"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.301245 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn" event={"ID":"11c1c8aa-1469-4229-9163-9df08ae4192f","Type":"ContainerStarted","Data":"84e8e4e36cc69e6a5206b9ef657dd27c1d0a16bf8791e9a5b611f2af760d2290"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.303028 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.304634 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jqfc" podStartSLOduration=121.304615848 
podStartE2EDuration="2m1.304615848s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:22.303060152 +0000 UTC m=+139.092255645" watchObservedRunningTime="2025-10-12 07:37:22.304615848 +0000 UTC m=+139.093811351" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.321614 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.339584 4599 generic.go:334] "Generic (PLEG): container finished" podID="ce6df133-1d7c-41a7-a9a8-a18676bae045" containerID="545073deca2a87daa651181f497f3960487d3d670ac48845f1bdea6fa3da38ee" exitCode=0 Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.339655 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" event={"ID":"ce6df133-1d7c-41a7-a9a8-a18676bae045","Type":"ContainerDied","Data":"545073deca2a87daa651181f497f3960487d3d670ac48845f1bdea6fa3da38ee"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.361309 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-26bbs" event={"ID":"3422bfa7-f155-488a-a72c-129bed440646","Type":"ContainerStarted","Data":"5443d308c2ac4d3e78f551801023787b37ea894aed390a16cd75a736a77ce456"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.362566 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:22 crc kubenswrapper[4599]: 
E1012 07:37:22.362912 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:22.862898682 +0000 UTC m=+139.652094183 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.376272 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-lwg5q" podStartSLOduration=122.376251659 podStartE2EDuration="2m2.376251659s" podCreationTimestamp="2025-10-12 07:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:22.330899428 +0000 UTC m=+139.120094930" watchObservedRunningTime="2025-10-12 07:37:22.376251659 +0000 UTC m=+139.165447162" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.386992 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvpk5" event={"ID":"a823a439-112f-4160-b925-b9b0830ec38f","Type":"ContainerStarted","Data":"84e3b8b489a5784c75be6b0d9851b5ee07b037b04c6270180798cab4e795d602"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.410284 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r" podStartSLOduration=121.410258102 
podStartE2EDuration="2m1.410258102s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:22.409489953 +0000 UTC m=+139.198685455" watchObservedRunningTime="2025-10-12 07:37:22.410258102 +0000 UTC m=+139.199453605" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.435818 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5mhff" event={"ID":"003c1afa-e2c6-45c8-bdb3-1da462a231d3","Type":"ContainerStarted","Data":"3633679f0946915bbd78984e1a2eab3cf8a3aa48a1d752374148457b8ea0994d"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.441423 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6" event={"ID":"9ffdc69e-7658-4b27-949a-594688dbee92","Type":"ContainerStarted","Data":"54c4cc03eb7d32858ed9d9e92401db24659511a16292d947e69b109e10ac4e0e"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.441453 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6" event={"ID":"9ffdc69e-7658-4b27-949a-594688dbee92","Type":"ContainerStarted","Data":"3bac2e2eac9b082fac4d09fd1ab8330c33f1d391e497dfaa504729e251eee357"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.452249 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5t7" event={"ID":"0c13d504-e266-472f-9ca7-6644f8b37eef","Type":"ContainerStarted","Data":"31c656c8905c36d58edd2234699e05b8fcb9251c8afc28166a062c8523b70add"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.452430 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" podStartSLOduration=121.452417239 
podStartE2EDuration="2m1.452417239s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:22.452415646 +0000 UTC m=+139.241611148" watchObservedRunningTime="2025-10-12 07:37:22.452417239 +0000 UTC m=+139.241612741" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.471018 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:22 crc kubenswrapper[4599]: E1012 07:37:22.471388 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:22.971359945 +0000 UTC m=+139.760555447 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.510632 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njb7r" event={"ID":"3cdcf878-abd9-481a-9f95-adeda51c77c8","Type":"ContainerStarted","Data":"4a409f3b083a9a4efe2bf32715fe0682d27efa99c78cc7fb6a19d7f6762941a0"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.523023 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-26bbs" podStartSLOduration=122.523009782 podStartE2EDuration="2m2.523009782s" podCreationTimestamp="2025-10-12 07:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:22.494971762 +0000 UTC m=+139.284167264" watchObservedRunningTime="2025-10-12 07:37:22.523009782 +0000 UTC m=+139.312205283" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.524422 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wqwjn" podStartSLOduration=121.524415684 podStartE2EDuration="2m1.524415684s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:22.517855104 +0000 UTC m=+139.307050607" watchObservedRunningTime="2025-10-12 07:37:22.524415684 +0000 UTC 
m=+139.313611186" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.525882 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-26f4s" event={"ID":"08fb2a9f-0906-4c64-97f8-232b3bc1cdd7","Type":"ContainerStarted","Data":"00eb1bc0417ea728aa9296321a48e557b5b4e99e3c38a6be5c64f142e7bbc788"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.544724 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z" event={"ID":"ee778128-189f-4177-9d3d-bbc1468da627","Type":"ContainerStarted","Data":"a12e39c2800bf0d230046cbc728c4f514cc5cafdb9d97d11833495f0d08410d3"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.551472 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5t7" podStartSLOduration=121.551460259 podStartE2EDuration="2m1.551460259s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:22.551228352 +0000 UTC m=+139.340423853" watchObservedRunningTime="2025-10-12 07:37:22.551460259 +0000 UTC m=+139.340655761" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.573214 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:22 crc kubenswrapper[4599]: E1012 07:37:22.582854 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-12 07:37:23.082837469 +0000 UTC m=+139.872032971 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.603905 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njb7r" podStartSLOduration=121.603882774 podStartE2EDuration="2m1.603882774s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:22.593240561 +0000 UTC m=+139.382436063" watchObservedRunningTime="2025-10-12 07:37:22.603882774 +0000 UTC m=+139.393078276" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.627760 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvpk5" podStartSLOduration=121.627745685 podStartE2EDuration="2m1.627745685s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:22.625354973 +0000 UTC m=+139.414550475" watchObservedRunningTime="2025-10-12 07:37:22.627745685 +0000 UTC m=+139.416941187" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.629152 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-dn2jf" 
event={"ID":"2bd8b379-5b19-453f-af09-9025f6b127db","Type":"ContainerStarted","Data":"5cd24f6d1ff8b65cb0281d210ed5acc676f10a33ddb824ec1a7d0d5a389dc863"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.658669 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wvhc6" podStartSLOduration=121.658630366 podStartE2EDuration="2m1.658630366s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:22.653945906 +0000 UTC m=+139.443141407" watchObservedRunningTime="2025-10-12 07:37:22.658630366 +0000 UTC m=+139.447825868" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.670420 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5kmjp" event={"ID":"147458bd-e3d6-4425-99ab-d2bd59fc2816","Type":"ContainerStarted","Data":"fc28a533048f275993fec048738b8a74f586198344dd16b0c943f7118f916056"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.675111 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:22 crc kubenswrapper[4599]: E1012 07:37:22.676131 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:23.176112135 +0000 UTC m=+139.965307636 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.693641 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r6fd4" event={"ID":"63a0b5d9-8401-4c7d-b897-295c91ffc508","Type":"ContainerStarted","Data":"0edaaddf3846202143b92d2e01ae7c331d3fbe0f095f0fa6202ad992e28ee732"} Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.706141 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h94q9" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.713283 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5mhff" podStartSLOduration=121.713269312 podStartE2EDuration="2m1.713269312s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:22.686445224 +0000 UTC m=+139.475640726" watchObservedRunningTime="2025-10-12 07:37:22.713269312 +0000 UTC m=+139.502464814" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.713974 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-26f4s" podStartSLOduration=6.713968402 podStartE2EDuration="6.713968402s" podCreationTimestamp="2025-10-12 07:37:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:22.712190637 +0000 UTC m=+139.501386139" watchObservedRunningTime="2025-10-12 07:37:22.713968402 +0000 UTC m=+139.503163925" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.740700 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-dn2jf" podStartSLOduration=121.740680409 podStartE2EDuration="2m1.740680409s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:22.732040355 +0000 UTC m=+139.521235856" watchObservedRunningTime="2025-10-12 07:37:22.740680409 +0000 UTC m=+139.529875911" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.788809 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:22 crc kubenswrapper[4599]: E1012 07:37:22.789795 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:23.289782678 +0000 UTC m=+140.078978180 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.889611 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:22 crc kubenswrapper[4599]: E1012 07:37:22.889777 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:23.389751555 +0000 UTC m=+140.178947057 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.890112 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:22 crc kubenswrapper[4599]: E1012 07:37:22.904775 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:23.404749838 +0000 UTC m=+140.193945340 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.959999 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mxnmk"] Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.970259 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxnmk" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.974612 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxnmk"] Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.983717 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.991238 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:22 crc kubenswrapper[4599]: E1012 07:37:22.991544 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:23.49152612 +0000 UTC m=+140.280721622 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:22 crc kubenswrapper[4599]: I1012 07:37:22.991835 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:22 crc kubenswrapper[4599]: E1012 07:37:22.992275 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:23.492257741 +0000 UTC m=+140.281453243 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.093319 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.093627 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt5rp\" (UniqueName: \"kubernetes.io/projected/ddcc528b-bf0a-404c-b121-cee466cb352c-kube-api-access-mt5rp\") pod \"community-operators-mxnmk\" (UID: \"ddcc528b-bf0a-404c-b121-cee466cb352c\") " pod="openshift-marketplace/community-operators-mxnmk" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.093686 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddcc528b-bf0a-404c-b121-cee466cb352c-catalog-content\") pod \"community-operators-mxnmk\" (UID: \"ddcc528b-bf0a-404c-b121-cee466cb352c\") " pod="openshift-marketplace/community-operators-mxnmk" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.093724 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddcc528b-bf0a-404c-b121-cee466cb352c-utilities\") pod \"community-operators-mxnmk\" 
(UID: \"ddcc528b-bf0a-404c-b121-cee466cb352c\") " pod="openshift-marketplace/community-operators-mxnmk" Oct 12 07:37:23 crc kubenswrapper[4599]: E1012 07:37:23.093827 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:23.593810026 +0000 UTC m=+140.383005528 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.140635 4599 patch_prober.go:28] interesting pod/router-default-5444994796-r64z4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 07:37:23 crc kubenswrapper[4599]: [-]has-synced failed: reason withheld Oct 12 07:37:23 crc kubenswrapper[4599]: [+]process-running ok Oct 12 07:37:23 crc kubenswrapper[4599]: healthz check failed Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.140719 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r64z4" podUID="1712254f-c8df-4c98-bfb2-bf79d98a6161" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.151564 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rmrp7"] Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 
07:37:23.152750 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rmrp7" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.155113 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.167503 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rmrp7"] Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.194865 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt5rp\" (UniqueName: \"kubernetes.io/projected/ddcc528b-bf0a-404c-b121-cee466cb352c-kube-api-access-mt5rp\") pod \"community-operators-mxnmk\" (UID: \"ddcc528b-bf0a-404c-b121-cee466cb352c\") " pod="openshift-marketplace/community-operators-mxnmk" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.194920 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddcc528b-bf0a-404c-b121-cee466cb352c-catalog-content\") pod \"community-operators-mxnmk\" (UID: \"ddcc528b-bf0a-404c-b121-cee466cb352c\") " pod="openshift-marketplace/community-operators-mxnmk" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.194948 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.194972 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ddcc528b-bf0a-404c-b121-cee466cb352c-utilities\") pod \"community-operators-mxnmk\" (UID: \"ddcc528b-bf0a-404c-b121-cee466cb352c\") " pod="openshift-marketplace/community-operators-mxnmk" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.195370 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddcc528b-bf0a-404c-b121-cee466cb352c-utilities\") pod \"community-operators-mxnmk\" (UID: \"ddcc528b-bf0a-404c-b121-cee466cb352c\") " pod="openshift-marketplace/community-operators-mxnmk" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.195766 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddcc528b-bf0a-404c-b121-cee466cb352c-catalog-content\") pod \"community-operators-mxnmk\" (UID: \"ddcc528b-bf0a-404c-b121-cee466cb352c\") " pod="openshift-marketplace/community-operators-mxnmk" Oct 12 07:37:23 crc kubenswrapper[4599]: E1012 07:37:23.196013 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:23.69600279 +0000 UTC m=+140.485198292 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.222193 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt5rp\" (UniqueName: \"kubernetes.io/projected/ddcc528b-bf0a-404c-b121-cee466cb352c-kube-api-access-mt5rp\") pod \"community-operators-mxnmk\" (UID: \"ddcc528b-bf0a-404c-b121-cee466cb352c\") " pod="openshift-marketplace/community-operators-mxnmk" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.296385 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:23 crc kubenswrapper[4599]: E1012 07:37:23.296567 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:23.796535873 +0000 UTC m=+140.585731374 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.296626 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b914c467-be05-49d4-a391-d98254248ade-utilities\") pod \"certified-operators-rmrp7\" (UID: \"b914c467-be05-49d4-a391-d98254248ade\") " pod="openshift-marketplace/certified-operators-rmrp7" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.296677 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b914c467-be05-49d4-a391-d98254248ade-catalog-content\") pod \"certified-operators-rmrp7\" (UID: \"b914c467-be05-49d4-a391-d98254248ade\") " pod="openshift-marketplace/certified-operators-rmrp7" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.296738 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.296792 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dbtv\" (UniqueName: 
\"kubernetes.io/projected/b914c467-be05-49d4-a391-d98254248ade-kube-api-access-5dbtv\") pod \"certified-operators-rmrp7\" (UID: \"b914c467-be05-49d4-a391-d98254248ade\") " pod="openshift-marketplace/certified-operators-rmrp7" Oct 12 07:37:23 crc kubenswrapper[4599]: E1012 07:37:23.297070 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:23.797051886 +0000 UTC m=+140.586247388 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.304089 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxnmk" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.345610 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fklx9"] Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.346502 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fklx9" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.363701 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fklx9"] Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.397659 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:23 crc kubenswrapper[4599]: E1012 07:37:23.397765 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:23.897751682 +0000 UTC m=+140.686947184 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.398022 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.398075 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dbtv\" (UniqueName: \"kubernetes.io/projected/b914c467-be05-49d4-a391-d98254248ade-kube-api-access-5dbtv\") pod \"certified-operators-rmrp7\" (UID: \"b914c467-be05-49d4-a391-d98254248ade\") " pod="openshift-marketplace/certified-operators-rmrp7" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.398103 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b914c467-be05-49d4-a391-d98254248ade-utilities\") pod \"certified-operators-rmrp7\" (UID: \"b914c467-be05-49d4-a391-d98254248ade\") " pod="openshift-marketplace/certified-operators-rmrp7" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.398134 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b914c467-be05-49d4-a391-d98254248ade-catalog-content\") pod \"certified-operators-rmrp7\" (UID: 
\"b914c467-be05-49d4-a391-d98254248ade\") " pod="openshift-marketplace/certified-operators-rmrp7" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.398489 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b914c467-be05-49d4-a391-d98254248ade-catalog-content\") pod \"certified-operators-rmrp7\" (UID: \"b914c467-be05-49d4-a391-d98254248ade\") " pod="openshift-marketplace/certified-operators-rmrp7" Oct 12 07:37:23 crc kubenswrapper[4599]: E1012 07:37:23.398744 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:23.898737573 +0000 UTC m=+140.687933074 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.399324 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b914c467-be05-49d4-a391-d98254248ade-utilities\") pod \"certified-operators-rmrp7\" (UID: \"b914c467-be05-49d4-a391-d98254248ade\") " pod="openshift-marketplace/certified-operators-rmrp7" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.424159 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dbtv\" (UniqueName: \"kubernetes.io/projected/b914c467-be05-49d4-a391-d98254248ade-kube-api-access-5dbtv\") pod \"certified-operators-rmrp7\" (UID: 
\"b914c467-be05-49d4-a391-d98254248ade\") " pod="openshift-marketplace/certified-operators-rmrp7" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.467964 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rmrp7" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.498785 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.498963 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2b7b084-738f-430d-8e2c-9c1b5c0ea421-utilities\") pod \"community-operators-fklx9\" (UID: \"e2b7b084-738f-430d-8e2c-9c1b5c0ea421\") " pod="openshift-marketplace/community-operators-fklx9" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.498994 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2b7b084-738f-430d-8e2c-9c1b5c0ea421-catalog-content\") pod \"community-operators-fklx9\" (UID: \"e2b7b084-738f-430d-8e2c-9c1b5c0ea421\") " pod="openshift-marketplace/community-operators-fklx9" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.499092 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9rrr\" (UniqueName: \"kubernetes.io/projected/e2b7b084-738f-430d-8e2c-9c1b5c0ea421-kube-api-access-k9rrr\") pod \"community-operators-fklx9\" (UID: \"e2b7b084-738f-430d-8e2c-9c1b5c0ea421\") " pod="openshift-marketplace/community-operators-fklx9" Oct 12 07:37:23 crc kubenswrapper[4599]: E1012 07:37:23.499189 
4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:23.999177008 +0000 UTC m=+140.788372510 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.560368 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-68sdf"] Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.562790 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-68sdf" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.565138 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-68sdf"] Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.601955 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.602012 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9rrr\" (UniqueName: \"kubernetes.io/projected/e2b7b084-738f-430d-8e2c-9c1b5c0ea421-kube-api-access-k9rrr\") pod \"community-operators-fklx9\" (UID: \"e2b7b084-738f-430d-8e2c-9c1b5c0ea421\") " pod="openshift-marketplace/community-operators-fklx9" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.602038 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2b7b084-738f-430d-8e2c-9c1b5c0ea421-utilities\") pod \"community-operators-fklx9\" (UID: \"e2b7b084-738f-430d-8e2c-9c1b5c0ea421\") " pod="openshift-marketplace/community-operators-fklx9" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.602058 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2b7b084-738f-430d-8e2c-9c1b5c0ea421-catalog-content\") pod \"community-operators-fklx9\" (UID: \"e2b7b084-738f-430d-8e2c-9c1b5c0ea421\") " pod="openshift-marketplace/community-operators-fklx9" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.602455 4599 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2b7b084-738f-430d-8e2c-9c1b5c0ea421-catalog-content\") pod \"community-operators-fklx9\" (UID: \"e2b7b084-738f-430d-8e2c-9c1b5c0ea421\") " pod="openshift-marketplace/community-operators-fklx9" Oct 12 07:37:23 crc kubenswrapper[4599]: E1012 07:37:23.602870 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:24.102860385 +0000 UTC m=+140.892055887 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.603284 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2b7b084-738f-430d-8e2c-9c1b5c0ea421-utilities\") pod \"community-operators-fklx9\" (UID: \"e2b7b084-738f-430d-8e2c-9c1b5c0ea421\") " pod="openshift-marketplace/community-operators-fklx9" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.627114 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9rrr\" (UniqueName: \"kubernetes.io/projected/e2b7b084-738f-430d-8e2c-9c1b5c0ea421-kube-api-access-k9rrr\") pod \"community-operators-fklx9\" (UID: \"e2b7b084-738f-430d-8e2c-9c1b5c0ea421\") " pod="openshift-marketplace/community-operators-fklx9" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.678464 4599 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-fklx9" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.704924 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.705190 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a84bfa7d-3d7b-48df-ba61-ab9640a00e1e-utilities\") pod \"certified-operators-68sdf\" (UID: \"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e\") " pod="openshift-marketplace/certified-operators-68sdf" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.705227 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a84bfa7d-3d7b-48df-ba61-ab9640a00e1e-catalog-content\") pod \"certified-operators-68sdf\" (UID: \"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e\") " pod="openshift-marketplace/certified-operators-68sdf" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.705271 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf2qq\" (UniqueName: \"kubernetes.io/projected/a84bfa7d-3d7b-48df-ba61-ab9640a00e1e-kube-api-access-rf2qq\") pod \"certified-operators-68sdf\" (UID: \"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e\") " pod="openshift-marketplace/certified-operators-68sdf" Oct 12 07:37:23 crc kubenswrapper[4599]: E1012 07:37:23.705434 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-10-12 07:37:24.205415472 +0000 UTC m=+140.994610974 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.802514 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z" event={"ID":"ee778128-189f-4177-9d3d-bbc1468da627","Type":"ContainerStarted","Data":"d362b4bcae4559466ce3685158edc77577c42f88488b734467f60428c7221e99"} Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.802575 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z" event={"ID":"ee778128-189f-4177-9d3d-bbc1468da627","Type":"ContainerStarted","Data":"c26b584d2733cf9aec087bb39cd058b10130c8995e2c5af3d9ae4b75048424a5"} Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.814313 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a84bfa7d-3d7b-48df-ba61-ab9640a00e1e-utilities\") pod \"certified-operators-68sdf\" (UID: \"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e\") " pod="openshift-marketplace/certified-operators-68sdf" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.814386 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a84bfa7d-3d7b-48df-ba61-ab9640a00e1e-catalog-content\") pod \"certified-operators-68sdf\" (UID: 
\"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e\") " pod="openshift-marketplace/certified-operators-68sdf" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.814431 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf2qq\" (UniqueName: \"kubernetes.io/projected/a84bfa7d-3d7b-48df-ba61-ab9640a00e1e-kube-api-access-rf2qq\") pod \"certified-operators-68sdf\" (UID: \"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e\") " pod="openshift-marketplace/certified-operators-68sdf" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.814452 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:23 crc kubenswrapper[4599]: E1012 07:37:23.814785 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:24.314773058 +0000 UTC m=+141.103968559 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.815011 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a84bfa7d-3d7b-48df-ba61-ab9640a00e1e-utilities\") pod \"certified-operators-68sdf\" (UID: \"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e\") " pod="openshift-marketplace/certified-operators-68sdf" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.815142 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a84bfa7d-3d7b-48df-ba61-ab9640a00e1e-catalog-content\") pod \"certified-operators-68sdf\" (UID: \"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e\") " pod="openshift-marketplace/certified-operators-68sdf" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.815970 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4b6ss" event={"ID":"ba82c167-7692-4fe7-843a-f9d17d328cfc","Type":"ContainerStarted","Data":"36707a5c465635cfdb57d5b3888c5e6deaf8163aceb8fe9c82f74ba6ccb47346"} Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.816022 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4b6ss" event={"ID":"ba82c167-7692-4fe7-843a-f9d17d328cfc","Type":"ContainerStarted","Data":"bc43bd4261a493f0fbeec80a5b5c70d38e661f535a6099c3b9b5657874f08eef"} Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.851885 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-canary/ingress-canary-26f4s" event={"ID":"08fb2a9f-0906-4c64-97f8-232b3bc1cdd7","Type":"ContainerStarted","Data":"2988f34b422cca3ec6c2ac66c43d547a995cb8c8007821dd5d29e2a319857981"} Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.863632 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf2qq\" (UniqueName: \"kubernetes.io/projected/a84bfa7d-3d7b-48df-ba61-ab9640a00e1e-kube-api-access-rf2qq\") pod \"certified-operators-68sdf\" (UID: \"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e\") " pod="openshift-marketplace/certified-operators-68sdf" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.863824 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxnmk"] Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.865693 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njb7r" event={"ID":"3cdcf878-abd9-481a-9f95-adeda51c77c8","Type":"ContainerStarted","Data":"365791ea4ab58ed1a3af70a3a826012ff1273fb54d4154b5abc5f983504ac3fe"} Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.873285 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.873605 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.876369 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-mhj6z" event={"ID":"a67d55ed-eb6f-4660-9013-de71da44aad7","Type":"ContainerStarted","Data":"fdba4d8ca8ad76e64479f9fb5c04780c603521586f7a1919555d5c25c778c285"} Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.876409 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-58897d9998-mhj6z" event={"ID":"a67d55ed-eb6f-4660-9013-de71da44aad7","Type":"ContainerStarted","Data":"24d88e10faa18c3c5d807b05f2068b1c661cc5e6d6b9385cb0c57e7c6d49c0d7"} Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.877654 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-mhj6z" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.899410 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-68sdf" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.902124 4599 patch_prober.go:28] interesting pod/console-operator-58897d9998-mhj6z container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.902162 4599 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-mhj6z" podUID="a67d55ed-eb6f-4660-9013-de71da44aad7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.904099 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.907642 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" event={"ID":"0409709f-7b11-4bcd-aec9-b8922c4474c9","Type":"ContainerStarted","Data":"c7939be1728583c6125b212bc7e1c21caa8e1b138c52420aca97473c66c34e0d"} Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.907678 4599 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" event={"ID":"0409709f-7b11-4bcd-aec9-b8922c4474c9","Type":"ContainerStarted","Data":"b3942cbd499da71260b4b3c439f122e6d6f1adfa9dff33072aec5cae01f26ee7"} Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.916036 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:23 crc kubenswrapper[4599]: E1012 07:37:23.917183 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:24.41716693 +0000 UTC m=+141.206362432 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.924232 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9m75z" podStartSLOduration=122.924210392 podStartE2EDuration="2m2.924210392s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:23.900510819 +0000 UTC m=+140.689706322" watchObservedRunningTime="2025-10-12 07:37:23.924210392 +0000 UTC m=+140.713405894" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.925380 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-4b6ss" podStartSLOduration=122.925376062 podStartE2EDuration="2m2.925376062s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:23.924632128 +0000 UTC m=+140.713827629" watchObservedRunningTime="2025-10-12 07:37:23.925376062 +0000 UTC m=+140.714571564" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.927026 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-dn2jf" event={"ID":"2bd8b379-5b19-453f-af09-9025f6b127db","Type":"ContainerStarted","Data":"116428e4f56adfcb1ea2cf988ac09c7ee09dbe6dcde1c3dabef146397aeecdc5"} Oct 12 07:37:23 crc 
kubenswrapper[4599]: I1012 07:37:23.936023 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qw5t7" event={"ID":"0c13d504-e266-472f-9ca7-6644f8b37eef","Type":"ContainerStarted","Data":"9020ace4e32876108f6ea37c92191ab984428bd1ae6ae4fdb5b1f06a28a89ca4"} Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.958616 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" event={"ID":"d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5","Type":"ContainerStarted","Data":"54a02e07e2e27b2c5e82be454f3c251d42fcc14b5404f74bd9a3ac1e6bc3a845"} Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.958656 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" event={"ID":"d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5","Type":"ContainerStarted","Data":"b4392f94f29a381b82692ec9446e895be57b994a594e05e1c1a0a4ff0e032d96"} Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.959430 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.972955 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ljckr" event={"ID":"2ca6d2a5-c322-47e7-97c5-cbae83506fe9","Type":"ContainerStarted","Data":"0ba396c14b92308c6e78cf51f08621c99c6d3fdc5a0cb65492a47023bbadf269"} Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.972998 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ljckr" event={"ID":"2ca6d2a5-c322-47e7-97c5-cbae83506fe9","Type":"ContainerStarted","Data":"4aafd916b66ac6151b1b1735761ed9faa235cc5d2e1e01ce552b7205c060c2c6"} Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.978955 4599 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ks8z9 
container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.979006 4599 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" podUID="d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.980589 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-mhj6z" podStartSLOduration=123.980576218 podStartE2EDuration="2m3.980576218s" podCreationTimestamp="2025-10-12 07:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:23.951931764 +0000 UTC m=+140.741127265" watchObservedRunningTime="2025-10-12 07:37:23.980576218 +0000 UTC m=+140.769771719" Oct 12 07:37:23 crc kubenswrapper[4599]: I1012 07:37:23.981512 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f8wgl" podStartSLOduration=123.981507174 podStartE2EDuration="2m3.981507174s" podCreationTimestamp="2025-10-12 07:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:23.97900896 +0000 UTC m=+140.768204462" watchObservedRunningTime="2025-10-12 07:37:23.981507174 +0000 UTC m=+140.770702676" Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.017686 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h9rz4" 
event={"ID":"7a0298f9-796d-4f69-bc48-f74d34944b99","Type":"ContainerStarted","Data":"31f99e33b8a890bd5d7450b4db0348b83a937629b16c0f8850160bc70e05ff84"} Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.017743 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h9rz4" event={"ID":"7a0298f9-796d-4f69-bc48-f74d34944b99","Type":"ContainerStarted","Data":"418949dc64b0aac9b61cc1e1cab6e070354a8859b48cd386b57578154ab31603"} Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.018647 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-h9rz4" Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.019260 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:24 crc kubenswrapper[4599]: E1012 07:37:24.020982 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:24.520969923 +0000 UTC m=+141.310165424 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.023763 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rmrp7"] Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.032254 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" event={"ID":"ce6df133-1d7c-41a7-a9a8-a18676bae045","Type":"ContainerStarted","Data":"a96bec2c2bb820452b113bac5a1576ecf381876cf57fa8ff5df6da7dcacc26b0"} Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.034907 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gjqh4" event={"ID":"c2d2a6b0-e99f-4c3b-8535-77d97c89d3ed","Type":"ContainerStarted","Data":"242c2ea5c0c4b18dbbcb53359a5184541b7af04b833c71c72fa7626d6bf0d39e"} Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.034927 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gjqh4" event={"ID":"c2d2a6b0-e99f-4c3b-8535-77d97c89d3ed","Type":"ContainerStarted","Data":"3e613922ae9c820e1519c9ae4e1c426fdb3d154ca232a85c7e8a6e6e482beea2"} Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.049615 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pnj7m" Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.053147 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5kmjp" event={"ID":"147458bd-e3d6-4425-99ab-d2bd59fc2816","Type":"ContainerStarted","Data":"247e87522c56bdcda7fca1dd1b83d9c368976a5a42eeb25483951635f3413a9b"} Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.054422 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" podStartSLOduration=123.054400649 podStartE2EDuration="2m3.054400649s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:24.049014755 +0000 UTC m=+140.838210257" watchObservedRunningTime="2025-10-12 07:37:24.054400649 +0000 UTC m=+140.843596151" Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.064589 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-h9rz4" podStartSLOduration=8.064578966 podStartE2EDuration="8.064578966s" podCreationTimestamp="2025-10-12 07:37:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:24.059962304 +0000 UTC m=+140.849157805" watchObservedRunningTime="2025-10-12 07:37:24.064578966 +0000 UTC m=+140.853774468" Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.096205 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" event={"ID":"10b365db-4478-4250-abe6-fa9e77354d70","Type":"ContainerStarted","Data":"0b164bff0e22ed0342a875650890587bbbc56243f615f11c5735c78b5699e125"} Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.096257 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" 
event={"ID":"10b365db-4478-4250-abe6-fa9e77354d70","Type":"ContainerStarted","Data":"9c8c765e974e53a662408fc3d6a6f78fe7d8e0aa32b1ad6c8ce90fbfa39cc852"} Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.133024 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:24 crc kubenswrapper[4599]: E1012 07:37:24.134277 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:24.634260618 +0000 UTC m=+141.423456121 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.144270 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5kmjp" podStartSLOduration=123.144253937 podStartE2EDuration="2m3.144253937s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:24.117935202 +0000 UTC m=+140.907130704" watchObservedRunningTime="2025-10-12 07:37:24.144253937 +0000 UTC 
m=+140.933449439" Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.146121 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-gjqh4" podStartSLOduration=123.146115451 podStartE2EDuration="2m3.146115451s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:24.14111798 +0000 UTC m=+140.930313482" watchObservedRunningTime="2025-10-12 07:37:24.146115451 +0000 UTC m=+140.935310963" Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.154448 4599 patch_prober.go:28] interesting pod/router-default-5444994796-r64z4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 07:37:24 crc kubenswrapper[4599]: [-]has-synced failed: reason withheld Oct 12 07:37:24 crc kubenswrapper[4599]: [+]process-running ok Oct 12 07:37:24 crc kubenswrapper[4599]: healthz check failed Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.154501 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r64z4" podUID="1712254f-c8df-4c98-bfb2-bf79d98a6161" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.179449 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xq6bd" event={"ID":"c565f4c6-fef1-4ba7-be0b-cfb3ade367a9","Type":"ContainerStarted","Data":"e637b2afb5249e67282ea2b0c0619f356e36aa6ddec44fe9e9b1d7c1fec095f0"} Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.179495 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xq6bd" 
event={"ID":"c565f4c6-fef1-4ba7-be0b-cfb3ade367a9","Type":"ContainerStarted","Data":"7374c7d9fdd18e13f46c366456fd282627662fb32c2ba906c9b29a88f30f6aa4"} Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.180434 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xq6bd" Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.182102 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" podStartSLOduration=124.182090207 podStartE2EDuration="2m4.182090207s" podCreationTimestamp="2025-10-12 07:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:24.181591726 +0000 UTC m=+140.970787229" watchObservedRunningTime="2025-10-12 07:37:24.182090207 +0000 UTC m=+140.971285709" Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.196802 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xq6bd" Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.232954 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xq6bd" podStartSLOduration=123.232934603 podStartE2EDuration="2m3.232934603s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:24.232353807 +0000 UTC m=+141.021549309" watchObservedRunningTime="2025-10-12 07:37:24.232934603 +0000 UTC m=+141.022130095" Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.235900 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:24 crc kubenswrapper[4599]: E1012 07:37:24.239791 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:24.739773419 +0000 UTC m=+141.528968920 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.247931 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r6fd4" event={"ID":"63a0b5d9-8401-4c7d-b897-295c91ffc508","Type":"ContainerStarted","Data":"c85c349d70a2a3512f98507463ff1840f042d7964e4e38c3cdac2df38830662c"} Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.247972 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r6fd4" Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.247983 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r6fd4" event={"ID":"63a0b5d9-8401-4c7d-b897-295c91ffc508","Type":"ContainerStarted","Data":"a9c8571a1c1770dd55e184670530dbf7f9f97c83a668734e87484d377dcf1bb6"} Oct 12 
07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.250686 4599 patch_prober.go:28] interesting pod/downloads-7954f5f757-kf2t6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.250746 4599 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kf2t6" podUID="b84f6b8e-f98e-4d84-823f-f83d7912bc6b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.307587 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fklx9"] Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.317914 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gj5kj" Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.329517 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r6fd4" podStartSLOduration=123.329496031 podStartE2EDuration="2m3.329496031s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:24.327915278 +0000 UTC m=+141.117110780" watchObservedRunningTime="2025-10-12 07:37:24.329496031 +0000 UTC m=+141.118691533" Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.338214 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:24 crc kubenswrapper[4599]: E1012 07:37:24.340204 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:24.840185623 +0000 UTC m=+141.629381124 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.404386 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-68sdf"] Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.440417 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:24 crc kubenswrapper[4599]: E1012 07:37:24.441105 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:24.941086599 +0000 UTC m=+141.730282101 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.542235 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:24 crc kubenswrapper[4599]: E1012 07:37:24.542705 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 07:37:25.042688447 +0000 UTC m=+141.831883950 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.579616 4599 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.647024 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:24 crc kubenswrapper[4599]: E1012 07:37:24.647430 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 07:37:25.147411376 +0000 UTC m=+141.936606878 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fwlgl" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.673581 4599 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-12T07:37:24.579639717Z","Handler":null,"Name":""} Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.682671 4599 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.682698 4599 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.749354 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.755063 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.851088 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.855293 4599 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.855365 4599 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:24 crc kubenswrapper[4599]: I1012 07:37:24.893625 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fwlgl\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.049552 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.138056 4599 patch_prober.go:28] interesting pod/router-default-5444994796-r64z4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 07:37:25 crc kubenswrapper[4599]: [-]has-synced failed: reason withheld Oct 12 07:37:25 crc kubenswrapper[4599]: [+]process-running ok Oct 12 07:37:25 crc kubenswrapper[4599]: healthz check failed Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.138346 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r64z4" podUID="1712254f-c8df-4c98-bfb2-bf79d98a6161" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.144547 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h8fml"] Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.145476 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h8fml" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.148145 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.160478 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h8fml"] Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.226443 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fwlgl"] Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.255018 4599 generic.go:334] "Generic (PLEG): container finished" podID="a84bfa7d-3d7b-48df-ba61-ab9640a00e1e" containerID="68d8337fb2cbf76dad3dfd0f642bede2bdc1f9299272093f7d09940757493074" exitCode=0 Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.255102 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68sdf" event={"ID":"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e","Type":"ContainerDied","Data":"68d8337fb2cbf76dad3dfd0f642bede2bdc1f9299272093f7d09940757493074"} Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.255136 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68sdf" event={"ID":"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e","Type":"ContainerStarted","Data":"1a3830d45c4e9ce666697152963d61721646271a0f8924f5d83ccb55b101f425"} Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.256899 4599 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.257167 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlglp\" (UniqueName: \"kubernetes.io/projected/cef6ba33-1bde-40ae-8025-05f6234cd636-kube-api-access-jlglp\") pod 
\"redhat-marketplace-h8fml\" (UID: \"cef6ba33-1bde-40ae-8025-05f6234cd636\") " pod="openshift-marketplace/redhat-marketplace-h8fml" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.257226 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef6ba33-1bde-40ae-8025-05f6234cd636-utilities\") pod \"redhat-marketplace-h8fml\" (UID: \"cef6ba33-1bde-40ae-8025-05f6234cd636\") " pod="openshift-marketplace/redhat-marketplace-h8fml" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.257297 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef6ba33-1bde-40ae-8025-05f6234cd636-catalog-content\") pod \"redhat-marketplace-h8fml\" (UID: \"cef6ba33-1bde-40ae-8025-05f6234cd636\") " pod="openshift-marketplace/redhat-marketplace-h8fml" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.259275 4599 generic.go:334] "Generic (PLEG): container finished" podID="ddcc528b-bf0a-404c-b121-cee466cb352c" containerID="76588b57c57fba6c8aaa04548ee3ad8b47b75c9f8dba00cc60db84fd032524f6" exitCode=0 Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.259328 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxnmk" event={"ID":"ddcc528b-bf0a-404c-b121-cee466cb352c","Type":"ContainerDied","Data":"76588b57c57fba6c8aaa04548ee3ad8b47b75c9f8dba00cc60db84fd032524f6"} Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.259361 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxnmk" event={"ID":"ddcc528b-bf0a-404c-b121-cee466cb352c","Type":"ContainerStarted","Data":"95c3ef1d3053912b86beb5a65805e9b00636503d078e67ca47e8a1b4c17423c1"} Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.264492 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-ljckr" event={"ID":"2ca6d2a5-c322-47e7-97c5-cbae83506fe9","Type":"ContainerStarted","Data":"1baf5572c609172c86771ed49fe2a663ec0be9f19cdb3cd98868ad9bedf5ee13"} Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.264530 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ljckr" event={"ID":"2ca6d2a5-c322-47e7-97c5-cbae83506fe9","Type":"ContainerStarted","Data":"b4f2644429c51c6e3f6a6100936adb0f70212044ebd7424e5dcc207346c72743"} Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.264557 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ljckr" event={"ID":"2ca6d2a5-c322-47e7-97c5-cbae83506fe9","Type":"ContainerStarted","Data":"85f48f6746f0e74c517b52acb13d5f6b913f7fe6798c9d0754a288c833d4500f"} Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.266879 4599 generic.go:334] "Generic (PLEG): container finished" podID="b914c467-be05-49d4-a391-d98254248ade" containerID="34746348f8f031d16440cfbf96cb4273432f70222570190aa8f7506da485c2f0" exitCode=0 Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.266995 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmrp7" event={"ID":"b914c467-be05-49d4-a391-d98254248ade","Type":"ContainerDied","Data":"34746348f8f031d16440cfbf96cb4273432f70222570190aa8f7506da485c2f0"} Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.267033 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmrp7" event={"ID":"b914c467-be05-49d4-a391-d98254248ade","Type":"ContainerStarted","Data":"cb1a4eb0176453d2db2fd0781e4bb24420268e13aae3fee3ec5f1f6f46db20bf"} Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.270369 4599 generic.go:334] "Generic (PLEG): container finished" podID="e2b7b084-738f-430d-8e2c-9c1b5c0ea421" containerID="69a0bb4c9375306a32148df64258e11dc2c7f391451403ccc07eae70380950ac" 
exitCode=0 Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.270429 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fklx9" event={"ID":"e2b7b084-738f-430d-8e2c-9c1b5c0ea421","Type":"ContainerDied","Data":"69a0bb4c9375306a32148df64258e11dc2c7f391451403ccc07eae70380950ac"} Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.270457 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fklx9" event={"ID":"e2b7b084-738f-430d-8e2c-9c1b5c0ea421","Type":"ContainerStarted","Data":"cf14898142682e25100c496652de2202483bb25e8a0d0434fa1c472d854207e1"} Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.272203 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" event={"ID":"6d9a3bfc-818e-445a-a027-264cfcfade2b","Type":"ContainerStarted","Data":"2774b3586dd22f700ce384d2404bb08a4474566fad82f025a372971a167f992a"} Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.273609 4599 patch_prober.go:28] interesting pod/downloads-7954f5f757-kf2t6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.273656 4599 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kf2t6" podUID="b84f6b8e-f98e-4d84-823f-f83d7912bc6b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.279511 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-mhj6z" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.280184 4599 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.286120 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-ljckr" podStartSLOduration=9.286107404 podStartE2EDuration="9.286107404s" podCreationTimestamp="2025-10-12 07:37:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:25.283944382 +0000 UTC m=+142.073139884" watchObservedRunningTime="2025-10-12 07:37:25.286107404 +0000 UTC m=+142.075302906" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.358646 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlglp\" (UniqueName: \"kubernetes.io/projected/cef6ba33-1bde-40ae-8025-05f6234cd636-kube-api-access-jlglp\") pod \"redhat-marketplace-h8fml\" (UID: \"cef6ba33-1bde-40ae-8025-05f6234cd636\") " pod="openshift-marketplace/redhat-marketplace-h8fml" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.358848 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef6ba33-1bde-40ae-8025-05f6234cd636-utilities\") pod \"redhat-marketplace-h8fml\" (UID: \"cef6ba33-1bde-40ae-8025-05f6234cd636\") " pod="openshift-marketplace/redhat-marketplace-h8fml" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.359194 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef6ba33-1bde-40ae-8025-05f6234cd636-catalog-content\") pod \"redhat-marketplace-h8fml\" (UID: \"cef6ba33-1bde-40ae-8025-05f6234cd636\") " pod="openshift-marketplace/redhat-marketplace-h8fml" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.363329 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef6ba33-1bde-40ae-8025-05f6234cd636-utilities\") pod \"redhat-marketplace-h8fml\" (UID: \"cef6ba33-1bde-40ae-8025-05f6234cd636\") " pod="openshift-marketplace/redhat-marketplace-h8fml" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.364643 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef6ba33-1bde-40ae-8025-05f6234cd636-catalog-content\") pod \"redhat-marketplace-h8fml\" (UID: \"cef6ba33-1bde-40ae-8025-05f6234cd636\") " pod="openshift-marketplace/redhat-marketplace-h8fml" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.383865 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlglp\" (UniqueName: \"kubernetes.io/projected/cef6ba33-1bde-40ae-8025-05f6234cd636-kube-api-access-jlglp\") pod \"redhat-marketplace-h8fml\" (UID: \"cef6ba33-1bde-40ae-8025-05f6234cd636\") " pod="openshift-marketplace/redhat-marketplace-h8fml" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.458926 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h8fml" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.544314 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cnl9m"] Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.545746 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cnl9m" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.553800 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.554346 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cnl9m"] Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.609669 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h8fml"] Oct 12 07:37:25 crc kubenswrapper[4599]: W1012 07:37:25.621024 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcef6ba33_1bde_40ae_8025_05f6234cd636.slice/crio-363864e6962e87086bd9cdba331a628918793ccc61055dd55edf3f96ce957a92 WatchSource:0}: Error finding container 363864e6962e87086bd9cdba331a628918793ccc61055dd55edf3f96ce957a92: Status 404 returned error can't find the container with id 363864e6962e87086bd9cdba331a628918793ccc61055dd55edf3f96ce957a92 Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.668622 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58710253-2df7-4fef-89bd-e5b6267bd0b4-utilities\") pod \"redhat-marketplace-cnl9m\" (UID: \"58710253-2df7-4fef-89bd-e5b6267bd0b4\") " pod="openshift-marketplace/redhat-marketplace-cnl9m" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.668683 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58710253-2df7-4fef-89bd-e5b6267bd0b4-catalog-content\") pod \"redhat-marketplace-cnl9m\" (UID: \"58710253-2df7-4fef-89bd-e5b6267bd0b4\") " 
pod="openshift-marketplace/redhat-marketplace-cnl9m" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.668720 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxk5n\" (UniqueName: \"kubernetes.io/projected/58710253-2df7-4fef-89bd-e5b6267bd0b4-kube-api-access-hxk5n\") pod \"redhat-marketplace-cnl9m\" (UID: \"58710253-2df7-4fef-89bd-e5b6267bd0b4\") " pod="openshift-marketplace/redhat-marketplace-cnl9m" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.769219 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58710253-2df7-4fef-89bd-e5b6267bd0b4-utilities\") pod \"redhat-marketplace-cnl9m\" (UID: \"58710253-2df7-4fef-89bd-e5b6267bd0b4\") " pod="openshift-marketplace/redhat-marketplace-cnl9m" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.769263 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58710253-2df7-4fef-89bd-e5b6267bd0b4-catalog-content\") pod \"redhat-marketplace-cnl9m\" (UID: \"58710253-2df7-4fef-89bd-e5b6267bd0b4\") " pod="openshift-marketplace/redhat-marketplace-cnl9m" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.769289 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxk5n\" (UniqueName: \"kubernetes.io/projected/58710253-2df7-4fef-89bd-e5b6267bd0b4-kube-api-access-hxk5n\") pod \"redhat-marketplace-cnl9m\" (UID: \"58710253-2df7-4fef-89bd-e5b6267bd0b4\") " pod="openshift-marketplace/redhat-marketplace-cnl9m" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.770120 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58710253-2df7-4fef-89bd-e5b6267bd0b4-utilities\") pod \"redhat-marketplace-cnl9m\" (UID: \"58710253-2df7-4fef-89bd-e5b6267bd0b4\") " 
pod="openshift-marketplace/redhat-marketplace-cnl9m" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.770147 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58710253-2df7-4fef-89bd-e5b6267bd0b4-catalog-content\") pod \"redhat-marketplace-cnl9m\" (UID: \"58710253-2df7-4fef-89bd-e5b6267bd0b4\") " pod="openshift-marketplace/redhat-marketplace-cnl9m" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.785558 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxk5n\" (UniqueName: \"kubernetes.io/projected/58710253-2df7-4fef-89bd-e5b6267bd0b4-kube-api-access-hxk5n\") pod \"redhat-marketplace-cnl9m\" (UID: \"58710253-2df7-4fef-89bd-e5b6267bd0b4\") " pod="openshift-marketplace/redhat-marketplace-cnl9m" Oct 12 07:37:25 crc kubenswrapper[4599]: I1012 07:37:25.877178 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cnl9m" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.050542 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cnl9m"] Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.140791 4599 patch_prober.go:28] interesting pod/router-default-5444994796-r64z4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 07:37:26 crc kubenswrapper[4599]: [-]has-synced failed: reason withheld Oct 12 07:37:26 crc kubenswrapper[4599]: [+]process-running ok Oct 12 07:37:26 crc kubenswrapper[4599]: healthz check failed Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.140834 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r64z4" podUID="1712254f-c8df-4c98-bfb2-bf79d98a6161" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.143693 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6m4tn"] Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.144852 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6m4tn" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.147460 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.184038 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6m4tn"] Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.280489 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" event={"ID":"6d9a3bfc-818e-445a-a027-264cfcfade2b","Type":"ContainerStarted","Data":"bd7cef82d922801b435ec3c8a023cdf88cb3edbfb61625290db1f691e9c21629"} Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.280628 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.284114 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/611866fc-22a0-472d-bcee-99765386e1fb-catalog-content\") pod \"redhat-operators-6m4tn\" (UID: \"611866fc-22a0-472d-bcee-99765386e1fb\") " pod="openshift-marketplace/redhat-operators-6m4tn" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.284180 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/611866fc-22a0-472d-bcee-99765386e1fb-utilities\") pod \"redhat-operators-6m4tn\" (UID: 
\"611866fc-22a0-472d-bcee-99765386e1fb\") " pod="openshift-marketplace/redhat-operators-6m4tn" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.284236 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbqcx\" (UniqueName: \"kubernetes.io/projected/611866fc-22a0-472d-bcee-99765386e1fb-kube-api-access-jbqcx\") pod \"redhat-operators-6m4tn\" (UID: \"611866fc-22a0-472d-bcee-99765386e1fb\") " pod="openshift-marketplace/redhat-operators-6m4tn" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.285294 4599 generic.go:334] "Generic (PLEG): container finished" podID="a560875d-c07e-457b-a77b-809cc770867c" containerID="09d77bf8350365feb41b8b233fae9fd3ad7ad85f82c388edfbf15103162997eb" exitCode=0 Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.285378 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r" event={"ID":"a560875d-c07e-457b-a77b-809cc770867c","Type":"ContainerDied","Data":"09d77bf8350365feb41b8b233fae9fd3ad7ad85f82c388edfbf15103162997eb"} Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.287265 4599 generic.go:334] "Generic (PLEG): container finished" podID="cef6ba33-1bde-40ae-8025-05f6234cd636" containerID="f9bcdfcfcbd626b50e1866253e8a8023879ea09f069527d80a08f5b24e586abe" exitCode=0 Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.287324 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8fml" event={"ID":"cef6ba33-1bde-40ae-8025-05f6234cd636","Type":"ContainerDied","Data":"f9bcdfcfcbd626b50e1866253e8a8023879ea09f069527d80a08f5b24e586abe"} Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.287386 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8fml" 
event={"ID":"cef6ba33-1bde-40ae-8025-05f6234cd636","Type":"ContainerStarted","Data":"363864e6962e87086bd9cdba331a628918793ccc61055dd55edf3f96ce957a92"} Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.292268 4599 generic.go:334] "Generic (PLEG): container finished" podID="58710253-2df7-4fef-89bd-e5b6267bd0b4" containerID="946159b5aec9641badd7ea8eef8018d45800ea7343e5ae8073faf5077358b29c" exitCode=0 Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.292370 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cnl9m" event={"ID":"58710253-2df7-4fef-89bd-e5b6267bd0b4","Type":"ContainerDied","Data":"946159b5aec9641badd7ea8eef8018d45800ea7343e5ae8073faf5077358b29c"} Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.292416 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cnl9m" event={"ID":"58710253-2df7-4fef-89bd-e5b6267bd0b4","Type":"ContainerStarted","Data":"14e694855222391d392f8d524d45dfa4ad3502201f52be123c81a8ec403ca850"} Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.302027 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" podStartSLOduration=125.302012198 podStartE2EDuration="2m5.302012198s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:26.294757458 +0000 UTC m=+143.083952960" watchObservedRunningTime="2025-10-12 07:37:26.302012198 +0000 UTC m=+143.091207700" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.386834 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbqcx\" (UniqueName: \"kubernetes.io/projected/611866fc-22a0-472d-bcee-99765386e1fb-kube-api-access-jbqcx\") pod \"redhat-operators-6m4tn\" (UID: \"611866fc-22a0-472d-bcee-99765386e1fb\") " 
pod="openshift-marketplace/redhat-operators-6m4tn" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.387778 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/611866fc-22a0-472d-bcee-99765386e1fb-catalog-content\") pod \"redhat-operators-6m4tn\" (UID: \"611866fc-22a0-472d-bcee-99765386e1fb\") " pod="openshift-marketplace/redhat-operators-6m4tn" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.388160 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/611866fc-22a0-472d-bcee-99765386e1fb-catalog-content\") pod \"redhat-operators-6m4tn\" (UID: \"611866fc-22a0-472d-bcee-99765386e1fb\") " pod="openshift-marketplace/redhat-operators-6m4tn" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.388268 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/611866fc-22a0-472d-bcee-99765386e1fb-utilities\") pod \"redhat-operators-6m4tn\" (UID: \"611866fc-22a0-472d-bcee-99765386e1fb\") " pod="openshift-marketplace/redhat-operators-6m4tn" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.388729 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/611866fc-22a0-472d-bcee-99765386e1fb-utilities\") pod \"redhat-operators-6m4tn\" (UID: \"611866fc-22a0-472d-bcee-99765386e1fb\") " pod="openshift-marketplace/redhat-operators-6m4tn" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.406001 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbqcx\" (UniqueName: \"kubernetes.io/projected/611866fc-22a0-472d-bcee-99765386e1fb-kube-api-access-jbqcx\") pod \"redhat-operators-6m4tn\" (UID: \"611866fc-22a0-472d-bcee-99765386e1fb\") " pod="openshift-marketplace/redhat-operators-6m4tn" Oct 12 07:37:26 crc 
kubenswrapper[4599]: I1012 07:37:26.485379 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6m4tn" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.546028 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fn444"] Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.547361 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fn444" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.554090 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fn444"] Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.591599 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb5r9\" (UniqueName: \"kubernetes.io/projected/0521e91a-c932-4050-9981-c52eefe3e1b9-kube-api-access-wb5r9\") pod \"redhat-operators-fn444\" (UID: \"0521e91a-c932-4050-9981-c52eefe3e1b9\") " pod="openshift-marketplace/redhat-operators-fn444" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.591660 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0521e91a-c932-4050-9981-c52eefe3e1b9-catalog-content\") pod \"redhat-operators-fn444\" (UID: \"0521e91a-c932-4050-9981-c52eefe3e1b9\") " pod="openshift-marketplace/redhat-operators-fn444" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.591788 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0521e91a-c932-4050-9981-c52eefe3e1b9-utilities\") pod \"redhat-operators-fn444\" (UID: \"0521e91a-c932-4050-9981-c52eefe3e1b9\") " pod="openshift-marketplace/redhat-operators-fn444" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.663566 4599 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6m4tn"] Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.696289 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb5r9\" (UniqueName: \"kubernetes.io/projected/0521e91a-c932-4050-9981-c52eefe3e1b9-kube-api-access-wb5r9\") pod \"redhat-operators-fn444\" (UID: \"0521e91a-c932-4050-9981-c52eefe3e1b9\") " pod="openshift-marketplace/redhat-operators-fn444" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.696383 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0521e91a-c932-4050-9981-c52eefe3e1b9-catalog-content\") pod \"redhat-operators-fn444\" (UID: \"0521e91a-c932-4050-9981-c52eefe3e1b9\") " pod="openshift-marketplace/redhat-operators-fn444" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.696441 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0521e91a-c932-4050-9981-c52eefe3e1b9-utilities\") pod \"redhat-operators-fn444\" (UID: \"0521e91a-c932-4050-9981-c52eefe3e1b9\") " pod="openshift-marketplace/redhat-operators-fn444" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.697296 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0521e91a-c932-4050-9981-c52eefe3e1b9-utilities\") pod \"redhat-operators-fn444\" (UID: \"0521e91a-c932-4050-9981-c52eefe3e1b9\") " pod="openshift-marketplace/redhat-operators-fn444" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.697331 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0521e91a-c932-4050-9981-c52eefe3e1b9-catalog-content\") pod \"redhat-operators-fn444\" (UID: \"0521e91a-c932-4050-9981-c52eefe3e1b9\") " 
pod="openshift-marketplace/redhat-operators-fn444" Oct 12 07:37:26 crc kubenswrapper[4599]: W1012 07:37:26.700979 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod611866fc_22a0_472d_bcee_99765386e1fb.slice/crio-828ea93e289771fbe9c6ec8eaf07d5f983c250a55738e23993598393acb0e85a WatchSource:0}: Error finding container 828ea93e289771fbe9c6ec8eaf07d5f983c250a55738e23993598393acb0e85a: Status 404 returned error can't find the container with id 828ea93e289771fbe9c6ec8eaf07d5f983c250a55738e23993598393acb0e85a Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.710933 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb5r9\" (UniqueName: \"kubernetes.io/projected/0521e91a-c932-4050-9981-c52eefe3e1b9-kube-api-access-wb5r9\") pod \"redhat-operators-fn444\" (UID: \"0521e91a-c932-4050-9981-c52eefe3e1b9\") " pod="openshift-marketplace/redhat-operators-fn444" Oct 12 07:37:26 crc kubenswrapper[4599]: I1012 07:37:26.862634 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fn444" Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.042432 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fn444"] Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.058948 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.060405 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.062793 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.064367 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.064558 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.100548 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6fe01788-af0e-426a-8c28-a26764993905-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6fe01788-af0e-426a-8c28-a26764993905\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.100654 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fe01788-af0e-426a-8c28-a26764993905-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6fe01788-af0e-426a-8c28-a26764993905\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.140664 4599 patch_prober.go:28] interesting pod/router-default-5444994796-r64z4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 07:37:27 crc kubenswrapper[4599]: [-]has-synced failed: reason withheld Oct 12 07:37:27 crc kubenswrapper[4599]: [+]process-running ok Oct 12 07:37:27 crc kubenswrapper[4599]: healthz check 
failed Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.140723 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r64z4" podUID="1712254f-c8df-4c98-bfb2-bf79d98a6161" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.205881 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6fe01788-af0e-426a-8c28-a26764993905-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6fe01788-af0e-426a-8c28-a26764993905\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.206203 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fe01788-af0e-426a-8c28-a26764993905-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6fe01788-af0e-426a-8c28-a26764993905\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.206553 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6fe01788-af0e-426a-8c28-a26764993905-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6fe01788-af0e-426a-8c28-a26764993905\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.224377 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fe01788-af0e-426a-8c28-a26764993905-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6fe01788-af0e-426a-8c28-a26764993905\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.302371 4599 generic.go:334] "Generic (PLEG): 
container finished" podID="0521e91a-c932-4050-9981-c52eefe3e1b9" containerID="e237d10b2590c0496d66772c943e401f156d1300bac87cde71792883e70ad367" exitCode=0 Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.302457 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fn444" event={"ID":"0521e91a-c932-4050-9981-c52eefe3e1b9","Type":"ContainerDied","Data":"e237d10b2590c0496d66772c943e401f156d1300bac87cde71792883e70ad367"} Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.302503 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fn444" event={"ID":"0521e91a-c932-4050-9981-c52eefe3e1b9","Type":"ContainerStarted","Data":"eac9f4bfcd96770f2335a6c6bfb1276c915b913648eb5124b1e2b6f8e3f99fe0"} Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.304719 4599 generic.go:334] "Generic (PLEG): container finished" podID="611866fc-22a0-472d-bcee-99765386e1fb" containerID="a88249d06a85ba5bd9b7f291cfbd8aef6edf63782fbf6926332fe4dcd164c1fd" exitCode=0 Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.306844 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6m4tn" event={"ID":"611866fc-22a0-472d-bcee-99765386e1fb","Type":"ContainerDied","Data":"a88249d06a85ba5bd9b7f291cfbd8aef6edf63782fbf6926332fe4dcd164c1fd"} Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.306871 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6m4tn" event={"ID":"611866fc-22a0-472d-bcee-99765386e1fb","Type":"ContainerStarted","Data":"828ea93e289771fbe9c6ec8eaf07d5f983c250a55738e23993598393acb0e85a"} Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.388455 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.556686 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r" Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.611164 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a560875d-c07e-457b-a77b-809cc770867c-secret-volume\") pod \"a560875d-c07e-457b-a77b-809cc770867c\" (UID: \"a560875d-c07e-457b-a77b-809cc770867c\") " Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.611224 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a560875d-c07e-457b-a77b-809cc770867c-config-volume\") pod \"a560875d-c07e-457b-a77b-809cc770867c\" (UID: \"a560875d-c07e-457b-a77b-809cc770867c\") " Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.611266 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzz22\" (UniqueName: \"kubernetes.io/projected/a560875d-c07e-457b-a77b-809cc770867c-kube-api-access-pzz22\") pod \"a560875d-c07e-457b-a77b-809cc770867c\" (UID: \"a560875d-c07e-457b-a77b-809cc770867c\") " Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.612998 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a560875d-c07e-457b-a77b-809cc770867c-config-volume" (OuterVolumeSpecName: "config-volume") pod "a560875d-c07e-457b-a77b-809cc770867c" (UID: "a560875d-c07e-457b-a77b-809cc770867c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.618684 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a560875d-c07e-457b-a77b-809cc770867c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a560875d-c07e-457b-a77b-809cc770867c" (UID: "a560875d-c07e-457b-a77b-809cc770867c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.619674 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a560875d-c07e-457b-a77b-809cc770867c-kube-api-access-pzz22" (OuterVolumeSpecName: "kube-api-access-pzz22") pod "a560875d-c07e-457b-a77b-809cc770867c" (UID: "a560875d-c07e-457b-a77b-809cc770867c"). InnerVolumeSpecName "kube-api-access-pzz22". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.713634 4599 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a560875d-c07e-457b-a77b-809cc770867c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.713933 4599 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a560875d-c07e-457b-a77b-809cc770867c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.713947 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzz22\" (UniqueName: \"kubernetes.io/projected/a560875d-c07e-457b-a77b-809cc770867c-kube-api-access-pzz22\") on node \"crc\" DevicePath \"\"" Oct 12 07:37:27 crc kubenswrapper[4599]: I1012 07:37:27.763678 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 12 07:37:27 crc kubenswrapper[4599]: W1012 
07:37:27.772571 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6fe01788_af0e_426a_8c28_a26764993905.slice/crio-1351f8ffe5fecf2ca010cc739164e528d3eaf7373029f61cc7f5f81364db0d77 WatchSource:0}: Error finding container 1351f8ffe5fecf2ca010cc739164e528d3eaf7373029f61cc7f5f81364db0d77: Status 404 returned error can't find the container with id 1351f8ffe5fecf2ca010cc739164e528d3eaf7373029f61cc7f5f81364db0d77 Oct 12 07:37:28 crc kubenswrapper[4599]: I1012 07:37:28.139166 4599 patch_prober.go:28] interesting pod/router-default-5444994796-r64z4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 07:37:28 crc kubenswrapper[4599]: [-]has-synced failed: reason withheld Oct 12 07:37:28 crc kubenswrapper[4599]: [+]process-running ok Oct 12 07:37:28 crc kubenswrapper[4599]: healthz check failed Oct 12 07:37:28 crc kubenswrapper[4599]: I1012 07:37:28.139443 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r64z4" podUID="1712254f-c8df-4c98-bfb2-bf79d98a6161" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 07:37:28 crc kubenswrapper[4599]: I1012 07:37:28.313584 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6fe01788-af0e-426a-8c28-a26764993905","Type":"ContainerStarted","Data":"a1a94b44edae2eb9afaca77ac482c56c6ab02ba04b82502ee186a711ad272aa8"} Oct 12 07:37:28 crc kubenswrapper[4599]: I1012 07:37:28.313701 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6fe01788-af0e-426a-8c28-a26764993905","Type":"ContainerStarted","Data":"1351f8ffe5fecf2ca010cc739164e528d3eaf7373029f61cc7f5f81364db0d77"} Oct 12 07:37:28 crc kubenswrapper[4599]: I1012 
07:37:28.316157 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r" event={"ID":"a560875d-c07e-457b-a77b-809cc770867c","Type":"ContainerDied","Data":"32231bc6559ab9349eb4f37f7397496baa8825e0857c02803e02e1b4f4d429f1"} Oct 12 07:37:28 crc kubenswrapper[4599]: I1012 07:37:28.316186 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32231bc6559ab9349eb4f37f7397496baa8825e0857c02803e02e1b4f4d429f1" Oct 12 07:37:28 crc kubenswrapper[4599]: I1012 07:37:28.316246 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r" Oct 12 07:37:28 crc kubenswrapper[4599]: I1012 07:37:28.321462 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:37:28 crc kubenswrapper[4599]: I1012 07:37:28.321535 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:37:28 crc kubenswrapper[4599]: I1012 07:37:28.329450 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.329411302 podStartE2EDuration="1.329411302s" podCreationTimestamp="2025-10-12 07:37:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:28.329232094 +0000 UTC m=+145.118427596" 
watchObservedRunningTime="2025-10-12 07:37:28.329411302 +0000 UTC m=+145.118606805" Oct 12 07:37:28 crc kubenswrapper[4599]: I1012 07:37:28.777431 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:28 crc kubenswrapper[4599]: I1012 07:37:28.777491 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:28 crc kubenswrapper[4599]: I1012 07:37:28.787134 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.109112 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-kf2t6" Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.135640 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-r64z4" Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.139394 4599 patch_prober.go:28] interesting pod/router-default-5444994796-r64z4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 07:37:29 crc kubenswrapper[4599]: [-]has-synced failed: reason withheld Oct 12 07:37:29 crc kubenswrapper[4599]: [+]process-running ok Oct 12 07:37:29 crc kubenswrapper[4599]: healthz check failed Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.139537 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r64z4" podUID="1712254f-c8df-4c98-bfb2-bf79d98a6161" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.324689 4599 generic.go:334] "Generic (PLEG): container finished" 
podID="6fe01788-af0e-426a-8c28-a26764993905" containerID="a1a94b44edae2eb9afaca77ac482c56c6ab02ba04b82502ee186a711ad272aa8" exitCode=0 Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.324769 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6fe01788-af0e-426a-8c28-a26764993905","Type":"ContainerDied","Data":"a1a94b44edae2eb9afaca77ac482c56c6ab02ba04b82502ee186a711ad272aa8"} Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.328863 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.328901 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.329222 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-v2b5m" Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.340118 4599 patch_prober.go:28] interesting pod/console-f9d7485db-lwg5q container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.38:8443/health\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.340172 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-lwg5q" podUID="f1daac45-8c29-46d6-a4ca-6c42bc99f1f7" containerName="console" probeResult="failure" output="Get \"https://10.217.0.38:8443/health\": dial tcp 10.217.0.38:8443: connect: connection refused" Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.442376 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.442529 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.442650 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.442690 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.445914 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.450526 4599 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.451204 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.462697 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.656902 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.664688 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 07:37:29 crc kubenswrapper[4599]: I1012 07:37:29.666531 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 07:37:30 crc kubenswrapper[4599]: I1012 07:37:30.137016 4599 patch_prober.go:28] interesting pod/router-default-5444994796-r64z4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 07:37:30 crc kubenswrapper[4599]: [-]has-synced failed: reason withheld Oct 12 07:37:30 crc kubenswrapper[4599]: [+]process-running ok Oct 12 07:37:30 crc kubenswrapper[4599]: healthz check failed Oct 12 07:37:30 crc kubenswrapper[4599]: I1012 07:37:30.137092 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r64z4" podUID="1712254f-c8df-4c98-bfb2-bf79d98a6161" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 07:37:31 crc kubenswrapper[4599]: I1012 07:37:31.137717 4599 patch_prober.go:28] interesting pod/router-default-5444994796-r64z4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 07:37:31 crc kubenswrapper[4599]: [-]has-synced failed: reason withheld Oct 12 07:37:31 crc kubenswrapper[4599]: [+]process-running ok Oct 12 07:37:31 crc kubenswrapper[4599]: healthz check failed Oct 12 07:37:31 crc kubenswrapper[4599]: I1012 07:37:31.137816 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r64z4" podUID="1712254f-c8df-4c98-bfb2-bf79d98a6161" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 07:37:32 crc kubenswrapper[4599]: I1012 07:37:32.138320 4599 patch_prober.go:28] interesting pod/router-default-5444994796-r64z4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed 
with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 07:37:32 crc kubenswrapper[4599]: [-]has-synced failed: reason withheld Oct 12 07:37:32 crc kubenswrapper[4599]: [+]process-running ok Oct 12 07:37:32 crc kubenswrapper[4599]: healthz check failed Oct 12 07:37:32 crc kubenswrapper[4599]: I1012 07:37:32.139358 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r64z4" podUID="1712254f-c8df-4c98-bfb2-bf79d98a6161" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 07:37:32 crc kubenswrapper[4599]: I1012 07:37:32.468684 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 12 07:37:32 crc kubenswrapper[4599]: E1012 07:37:32.468933 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a560875d-c07e-457b-a77b-809cc770867c" containerName="collect-profiles" Oct 12 07:37:32 crc kubenswrapper[4599]: I1012 07:37:32.468952 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="a560875d-c07e-457b-a77b-809cc770867c" containerName="collect-profiles" Oct 12 07:37:32 crc kubenswrapper[4599]: I1012 07:37:32.469047 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="a560875d-c07e-457b-a77b-809cc770867c" containerName="collect-profiles" Oct 12 07:37:32 crc kubenswrapper[4599]: I1012 07:37:32.469453 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 07:37:32 crc kubenswrapper[4599]: I1012 07:37:32.472361 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 12 07:37:32 crc kubenswrapper[4599]: I1012 07:37:32.473411 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 12 07:37:32 crc kubenswrapper[4599]: I1012 07:37:32.475687 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 12 07:37:32 crc kubenswrapper[4599]: I1012 07:37:32.592163 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1dd71739-e6a2-4279-9d03-6e94dd3d95e8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1dd71739-e6a2-4279-9d03-6e94dd3d95e8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 07:37:32 crc kubenswrapper[4599]: I1012 07:37:32.592210 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1dd71739-e6a2-4279-9d03-6e94dd3d95e8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1dd71739-e6a2-4279-9d03-6e94dd3d95e8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 07:37:32 crc kubenswrapper[4599]: I1012 07:37:32.693812 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1dd71739-e6a2-4279-9d03-6e94dd3d95e8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1dd71739-e6a2-4279-9d03-6e94dd3d95e8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 07:37:32 crc kubenswrapper[4599]: I1012 07:37:32.693958 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/1dd71739-e6a2-4279-9d03-6e94dd3d95e8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1dd71739-e6a2-4279-9d03-6e94dd3d95e8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 07:37:32 crc kubenswrapper[4599]: I1012 07:37:32.693972 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1dd71739-e6a2-4279-9d03-6e94dd3d95e8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1dd71739-e6a2-4279-9d03-6e94dd3d95e8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 07:37:32 crc kubenswrapper[4599]: I1012 07:37:32.721402 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1dd71739-e6a2-4279-9d03-6e94dd3d95e8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1dd71739-e6a2-4279-9d03-6e94dd3d95e8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 07:37:32 crc kubenswrapper[4599]: I1012 07:37:32.791893 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 07:37:33 crc kubenswrapper[4599]: I1012 07:37:33.137912 4599 patch_prober.go:28] interesting pod/router-default-5444994796-r64z4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 07:37:33 crc kubenswrapper[4599]: [-]has-synced failed: reason withheld Oct 12 07:37:33 crc kubenswrapper[4599]: [+]process-running ok Oct 12 07:37:33 crc kubenswrapper[4599]: healthz check failed Oct 12 07:37:33 crc kubenswrapper[4599]: I1012 07:37:33.137974 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r64z4" podUID="1712254f-c8df-4c98-bfb2-bf79d98a6161" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 07:37:33 crc kubenswrapper[4599]: I1012 07:37:33.171807 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 07:37:33 crc kubenswrapper[4599]: I1012 07:37:33.302790 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6fe01788-af0e-426a-8c28-a26764993905-kubelet-dir\") pod \"6fe01788-af0e-426a-8c28-a26764993905\" (UID: \"6fe01788-af0e-426a-8c28-a26764993905\") " Oct 12 07:37:33 crc kubenswrapper[4599]: I1012 07:37:33.302859 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fe01788-af0e-426a-8c28-a26764993905-kube-api-access\") pod \"6fe01788-af0e-426a-8c28-a26764993905\" (UID: \"6fe01788-af0e-426a-8c28-a26764993905\") " Oct 12 07:37:33 crc kubenswrapper[4599]: I1012 07:37:33.302947 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fe01788-af0e-426a-8c28-a26764993905-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6fe01788-af0e-426a-8c28-a26764993905" (UID: "6fe01788-af0e-426a-8c28-a26764993905"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:37:33 crc kubenswrapper[4599]: I1012 07:37:33.303184 4599 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6fe01788-af0e-426a-8c28-a26764993905-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 12 07:37:33 crc kubenswrapper[4599]: I1012 07:37:33.319987 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe01788-af0e-426a-8c28-a26764993905-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6fe01788-af0e-426a-8c28-a26764993905" (UID: "6fe01788-af0e-426a-8c28-a26764993905"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:37:33 crc kubenswrapper[4599]: I1012 07:37:33.361920 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6fe01788-af0e-426a-8c28-a26764993905","Type":"ContainerDied","Data":"1351f8ffe5fecf2ca010cc739164e528d3eaf7373029f61cc7f5f81364db0d77"} Oct 12 07:37:33 crc kubenswrapper[4599]: I1012 07:37:33.361968 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1351f8ffe5fecf2ca010cc739164e528d3eaf7373029f61cc7f5f81364db0d77" Oct 12 07:37:33 crc kubenswrapper[4599]: I1012 07:37:33.361979 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 07:37:33 crc kubenswrapper[4599]: I1012 07:37:33.403829 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fe01788-af0e-426a-8c28-a26764993905-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 12 07:37:34 crc kubenswrapper[4599]: W1012 07:37:34.038749 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-46306db736868d3d080bc4ec078b5969a250c1c5fb87925cedb58584fb0c5750 WatchSource:0}: Error finding container 46306db736868d3d080bc4ec078b5969a250c1c5fb87925cedb58584fb0c5750: Status 404 returned error can't find the container with id 46306db736868d3d080bc4ec078b5969a250c1c5fb87925cedb58584fb0c5750 Oct 12 07:37:34 crc kubenswrapper[4599]: W1012 07:37:34.128832 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-5818746de95429267af9410545ff04dca7cae2cdcc9a338ee801924529d4a5af WatchSource:0}: Error finding container 
5818746de95429267af9410545ff04dca7cae2cdcc9a338ee801924529d4a5af: Status 404 returned error can't find the container with id 5818746de95429267af9410545ff04dca7cae2cdcc9a338ee801924529d4a5af Oct 12 07:37:34 crc kubenswrapper[4599]: I1012 07:37:34.137135 4599 patch_prober.go:28] interesting pod/router-default-5444994796-r64z4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 07:37:34 crc kubenswrapper[4599]: [-]has-synced failed: reason withheld Oct 12 07:37:34 crc kubenswrapper[4599]: [+]process-running ok Oct 12 07:37:34 crc kubenswrapper[4599]: healthz check failed Oct 12 07:37:34 crc kubenswrapper[4599]: I1012 07:37:34.137183 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r64z4" podUID="1712254f-c8df-4c98-bfb2-bf79d98a6161" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 07:37:34 crc kubenswrapper[4599]: I1012 07:37:34.266791 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 12 07:37:34 crc kubenswrapper[4599]: W1012 07:37:34.276219 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1dd71739_e6a2_4279_9d03_6e94dd3d95e8.slice/crio-de5f0b3876d163f27c56f9ffb2b7a932aadef3ffa6f164cf3712752e96219d03 WatchSource:0}: Error finding container de5f0b3876d163f27c56f9ffb2b7a932aadef3ffa6f164cf3712752e96219d03: Status 404 returned error can't find the container with id de5f0b3876d163f27c56f9ffb2b7a932aadef3ffa6f164cf3712752e96219d03 Oct 12 07:37:34 crc kubenswrapper[4599]: W1012 07:37:34.277570 4599 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-177f4b4b345fcfd647e533881914ae8b854cd219e506034e2e6e44ca2401cf8d WatchSource:0}: Error finding container 177f4b4b345fcfd647e533881914ae8b854cd219e506034e2e6e44ca2401cf8d: Status 404 returned error can't find the container with id 177f4b4b345fcfd647e533881914ae8b854cd219e506034e2e6e44ca2401cf8d Oct 12 07:37:34 crc kubenswrapper[4599]: I1012 07:37:34.378457 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"177f4b4b345fcfd647e533881914ae8b854cd219e506034e2e6e44ca2401cf8d"} Oct 12 07:37:34 crc kubenswrapper[4599]: I1012 07:37:34.383518 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9d8d70062693556e60262e47cc1476b4d7f11f083a168aeba630a9075384eadd"} Oct 12 07:37:34 crc kubenswrapper[4599]: I1012 07:37:34.383578 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5818746de95429267af9410545ff04dca7cae2cdcc9a338ee801924529d4a5af"} Oct 12 07:37:34 crc kubenswrapper[4599]: I1012 07:37:34.385625 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"19b2e8712eed7d1203fd553c602e1837c5b4677c018cfd64d2c5d0c392bed905"} Oct 12 07:37:34 crc kubenswrapper[4599]: I1012 07:37:34.385656 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"46306db736868d3d080bc4ec078b5969a250c1c5fb87925cedb58584fb0c5750"} Oct 12 07:37:34 crc kubenswrapper[4599]: I1012 07:37:34.388068 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1dd71739-e6a2-4279-9d03-6e94dd3d95e8","Type":"ContainerStarted","Data":"de5f0b3876d163f27c56f9ffb2b7a932aadef3ffa6f164cf3712752e96219d03"} Oct 12 07:37:34 crc kubenswrapper[4599]: I1012 07:37:34.960353 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-h9rz4" Oct 12 07:37:35 crc kubenswrapper[4599]: I1012 07:37:35.138597 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-r64z4" Oct 12 07:37:35 crc kubenswrapper[4599]: I1012 07:37:35.141366 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-r64z4" Oct 12 07:37:35 crc kubenswrapper[4599]: I1012 07:37:35.400193 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2058286fb3e12def39fc384eab25f4bf7766e9b661342b9fb630132ba917632d"} Oct 12 07:37:35 crc kubenswrapper[4599]: I1012 07:37:35.400258 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:37:35 crc kubenswrapper[4599]: I1012 07:37:35.403231 4599 generic.go:334] "Generic (PLEG): container finished" podID="1dd71739-e6a2-4279-9d03-6e94dd3d95e8" containerID="ef7a22805765bcb436243da2ef1366c0ed1de71c16d4dfbd47c18c6a778e2596" exitCode=0 Oct 12 07:37:35 crc kubenswrapper[4599]: I1012 07:37:35.403492 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"1dd71739-e6a2-4279-9d03-6e94dd3d95e8","Type":"ContainerDied","Data":"ef7a22805765bcb436243da2ef1366c0ed1de71c16d4dfbd47c18c6a778e2596"} Oct 12 07:37:38 crc kubenswrapper[4599]: I1012 07:37:38.665171 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 07:37:38 crc kubenswrapper[4599]: I1012 07:37:38.797133 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1dd71739-e6a2-4279-9d03-6e94dd3d95e8-kubelet-dir\") pod \"1dd71739-e6a2-4279-9d03-6e94dd3d95e8\" (UID: \"1dd71739-e6a2-4279-9d03-6e94dd3d95e8\") " Oct 12 07:37:38 crc kubenswrapper[4599]: I1012 07:37:38.797186 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1dd71739-e6a2-4279-9d03-6e94dd3d95e8-kube-api-access\") pod \"1dd71739-e6a2-4279-9d03-6e94dd3d95e8\" (UID: \"1dd71739-e6a2-4279-9d03-6e94dd3d95e8\") " Oct 12 07:37:38 crc kubenswrapper[4599]: I1012 07:37:38.797308 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dd71739-e6a2-4279-9d03-6e94dd3d95e8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1dd71739-e6a2-4279-9d03-6e94dd3d95e8" (UID: "1dd71739-e6a2-4279-9d03-6e94dd3d95e8"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:37:38 crc kubenswrapper[4599]: I1012 07:37:38.797660 4599 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1dd71739-e6a2-4279-9d03-6e94dd3d95e8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 12 07:37:38 crc kubenswrapper[4599]: I1012 07:37:38.803090 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dd71739-e6a2-4279-9d03-6e94dd3d95e8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1dd71739-e6a2-4279-9d03-6e94dd3d95e8" (UID: "1dd71739-e6a2-4279-9d03-6e94dd3d95e8"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:37:38 crc kubenswrapper[4599]: I1012 07:37:38.898828 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1dd71739-e6a2-4279-9d03-6e94dd3d95e8-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 12 07:37:39 crc kubenswrapper[4599]: I1012 07:37:39.333101 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:39 crc kubenswrapper[4599]: I1012 07:37:39.336742 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:37:39 crc kubenswrapper[4599]: I1012 07:37:39.430348 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1dd71739-e6a2-4279-9d03-6e94dd3d95e8","Type":"ContainerDied","Data":"de5f0b3876d163f27c56f9ffb2b7a932aadef3ffa6f164cf3712752e96219d03"} Oct 12 07:37:39 crc kubenswrapper[4599]: I1012 07:37:39.430396 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de5f0b3876d163f27c56f9ffb2b7a932aadef3ffa6f164cf3712752e96219d03" Oct 12 07:37:39 crc kubenswrapper[4599]: I1012 
07:37:39.430451 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 07:37:42 crc kubenswrapper[4599]: I1012 07:37:42.555218 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs\") pod \"network-metrics-daemon-kwphq\" (UID: \"3c3e76cc-139b-4a2a-b96b-6077e3706376\") " pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:37:42 crc kubenswrapper[4599]: I1012 07:37:42.560984 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c3e76cc-139b-4a2a-b96b-6077e3706376-metrics-certs\") pod \"network-metrics-daemon-kwphq\" (UID: \"3c3e76cc-139b-4a2a-b96b-6077e3706376\") " pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:37:42 crc kubenswrapper[4599]: I1012 07:37:42.756676 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kwphq" Oct 12 07:37:45 crc kubenswrapper[4599]: I1012 07:37:45.059647 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:37:45 crc kubenswrapper[4599]: I1012 07:37:45.156007 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kwphq"] Oct 12 07:37:45 crc kubenswrapper[4599]: W1012 07:37:45.161719 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c3e76cc_139b_4a2a_b96b_6077e3706376.slice/crio-e4fcf5f956a940dfeab9e8bb5dab0b1e867be1f09efc8a747afc579706f64901 WatchSource:0}: Error finding container e4fcf5f956a940dfeab9e8bb5dab0b1e867be1f09efc8a747afc579706f64901: Status 404 returned error can't find the container with id e4fcf5f956a940dfeab9e8bb5dab0b1e867be1f09efc8a747afc579706f64901 Oct 12 07:37:45 crc kubenswrapper[4599]: I1012 07:37:45.459682 4599 generic.go:334] "Generic (PLEG): container finished" podID="ddcc528b-bf0a-404c-b121-cee466cb352c" containerID="0a603fb84f4af91bda8aa098a056118d0aa96c622321220603ce048e4bd960cc" exitCode=0 Oct 12 07:37:45 crc kubenswrapper[4599]: I1012 07:37:45.459746 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxnmk" event={"ID":"ddcc528b-bf0a-404c-b121-cee466cb352c","Type":"ContainerDied","Data":"0a603fb84f4af91bda8aa098a056118d0aa96c622321220603ce048e4bd960cc"} Oct 12 07:37:45 crc kubenswrapper[4599]: I1012 07:37:45.462079 4599 generic.go:334] "Generic (PLEG): container finished" podID="58710253-2df7-4fef-89bd-e5b6267bd0b4" containerID="c54fab04c62c44c1ea48a4f0d7d1bd55565cfd5f1c6719ce0a0d8aebbb699381" exitCode=0 Oct 12 07:37:45 crc kubenswrapper[4599]: I1012 07:37:45.462160 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cnl9m" 
event={"ID":"58710253-2df7-4fef-89bd-e5b6267bd0b4","Type":"ContainerDied","Data":"c54fab04c62c44c1ea48a4f0d7d1bd55565cfd5f1c6719ce0a0d8aebbb699381"} Oct 12 07:37:45 crc kubenswrapper[4599]: I1012 07:37:45.465272 4599 generic.go:334] "Generic (PLEG): container finished" podID="a84bfa7d-3d7b-48df-ba61-ab9640a00e1e" containerID="69749f74fd526565e5458a328dc98dce3975b039fbd0d3c35f08b2eb74e00589" exitCode=0 Oct 12 07:37:45 crc kubenswrapper[4599]: I1012 07:37:45.465325 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68sdf" event={"ID":"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e","Type":"ContainerDied","Data":"69749f74fd526565e5458a328dc98dce3975b039fbd0d3c35f08b2eb74e00589"} Oct 12 07:37:45 crc kubenswrapper[4599]: I1012 07:37:45.467786 4599 generic.go:334] "Generic (PLEG): container finished" podID="cef6ba33-1bde-40ae-8025-05f6234cd636" containerID="8d175c94e96ae077668ee88795397d4d970088af5ffda7b5f89fa1bd2dfd5195" exitCode=0 Oct 12 07:37:45 crc kubenswrapper[4599]: I1012 07:37:45.467861 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8fml" event={"ID":"cef6ba33-1bde-40ae-8025-05f6234cd636","Type":"ContainerDied","Data":"8d175c94e96ae077668ee88795397d4d970088af5ffda7b5f89fa1bd2dfd5195"} Oct 12 07:37:45 crc kubenswrapper[4599]: I1012 07:37:45.471804 4599 generic.go:334] "Generic (PLEG): container finished" podID="e2b7b084-738f-430d-8e2c-9c1b5c0ea421" containerID="065d2923576c63cd8855f2dfe51a09801faab8a8d4d795b8809fd778a5217064" exitCode=0 Oct 12 07:37:45 crc kubenswrapper[4599]: I1012 07:37:45.471852 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fklx9" event={"ID":"e2b7b084-738f-430d-8e2c-9c1b5c0ea421","Type":"ContainerDied","Data":"065d2923576c63cd8855f2dfe51a09801faab8a8d4d795b8809fd778a5217064"} Oct 12 07:37:45 crc kubenswrapper[4599]: I1012 07:37:45.475349 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-fn444" event={"ID":"0521e91a-c932-4050-9981-c52eefe3e1b9","Type":"ContainerStarted","Data":"8fbfdfba6136b4f3f9c59bb132c886cc17853e477843e23494637a884cb7032a"} Oct 12 07:37:45 crc kubenswrapper[4599]: I1012 07:37:45.477471 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kwphq" event={"ID":"3c3e76cc-139b-4a2a-b96b-6077e3706376","Type":"ContainerStarted","Data":"e4fcf5f956a940dfeab9e8bb5dab0b1e867be1f09efc8a747afc579706f64901"} Oct 12 07:37:45 crc kubenswrapper[4599]: I1012 07:37:45.480853 4599 generic.go:334] "Generic (PLEG): container finished" podID="611866fc-22a0-472d-bcee-99765386e1fb" containerID="12c02f40f3fcb5442fb475b028a9ace7e32b4f8202bf652b231fe98262c1879b" exitCode=0 Oct 12 07:37:45 crc kubenswrapper[4599]: I1012 07:37:45.480913 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6m4tn" event={"ID":"611866fc-22a0-472d-bcee-99765386e1fb","Type":"ContainerDied","Data":"12c02f40f3fcb5442fb475b028a9ace7e32b4f8202bf652b231fe98262c1879b"} Oct 12 07:37:45 crc kubenswrapper[4599]: I1012 07:37:45.482291 4599 generic.go:334] "Generic (PLEG): container finished" podID="b914c467-be05-49d4-a391-d98254248ade" containerID="8fa8e97086ab5538d032f280770d063f09ce1a6f5d2223d709ccf9bacefd5317" exitCode=0 Oct 12 07:37:45 crc kubenswrapper[4599]: I1012 07:37:45.482384 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmrp7" event={"ID":"b914c467-be05-49d4-a391-d98254248ade","Type":"ContainerDied","Data":"8fa8e97086ab5538d032f280770d063f09ce1a6f5d2223d709ccf9bacefd5317"} Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.489596 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8fml" event={"ID":"cef6ba33-1bde-40ae-8025-05f6234cd636","Type":"ContainerStarted","Data":"4f37fb21a9d7b651793aa3c92edc4f69bd375b1e19319aead6676b3387455032"} 
Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.492015 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fklx9" event={"ID":"e2b7b084-738f-430d-8e2c-9c1b5c0ea421","Type":"ContainerStarted","Data":"727fb17cd757444fe2568b46a7d8af258d04c9d7becd8853ea9b3d72cb8127fd"} Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.495104 4599 generic.go:334] "Generic (PLEG): container finished" podID="0521e91a-c932-4050-9981-c52eefe3e1b9" containerID="8fbfdfba6136b4f3f9c59bb132c886cc17853e477843e23494637a884cb7032a" exitCode=0 Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.495162 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fn444" event={"ID":"0521e91a-c932-4050-9981-c52eefe3e1b9","Type":"ContainerDied","Data":"8fbfdfba6136b4f3f9c59bb132c886cc17853e477843e23494637a884cb7032a"} Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.495187 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fn444" event={"ID":"0521e91a-c932-4050-9981-c52eefe3e1b9","Type":"ContainerStarted","Data":"e70721671c353fbfad3dad2de2f9c5c92279e0610f67dc2a3908a9283d53f95d"} Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.497384 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68sdf" event={"ID":"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e","Type":"ContainerStarted","Data":"0c38cf8cd6affa4a732d3f63dc6a83c1462dbd049f1ba61c52dda5e30b7e7d4b"} Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.502132 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxnmk" event={"ID":"ddcc528b-bf0a-404c-b121-cee466cb352c","Type":"ContainerStarted","Data":"c790f0beda1a60adb007e007d9ac5d70ab13706da271eb3da0975bf047676b30"} Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.504254 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-rmrp7" event={"ID":"b914c467-be05-49d4-a391-d98254248ade","Type":"ContainerStarted","Data":"2d8c1f800c24b0c46ecf9d481d0d9309c367575ed5e96f952ad8b4560243327f"} Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.506291 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cnl9m" event={"ID":"58710253-2df7-4fef-89bd-e5b6267bd0b4","Type":"ContainerStarted","Data":"d27fcfd187117500009d0d39c5e4c400282c271c0e06f503096661de1b4122ff"} Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.507188 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h8fml" podStartSLOduration=1.714552006 podStartE2EDuration="21.507169074s" podCreationTimestamp="2025-10-12 07:37:25 +0000 UTC" firstStartedPulling="2025-10-12 07:37:26.288569811 +0000 UTC m=+143.077765313" lastFinishedPulling="2025-10-12 07:37:46.081186879 +0000 UTC m=+162.870382381" observedRunningTime="2025-10-12 07:37:46.504079924 +0000 UTC m=+163.293275427" watchObservedRunningTime="2025-10-12 07:37:46.507169074 +0000 UTC m=+163.296364576" Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.507719 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kwphq" event={"ID":"3c3e76cc-139b-4a2a-b96b-6077e3706376","Type":"ContainerStarted","Data":"9e9521c7fd8b5aaecdf38b494494412f0661cf4db50df2bbc27b29bdff3eed02"} Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.507747 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kwphq" event={"ID":"3c3e76cc-139b-4a2a-b96b-6077e3706376","Type":"ContainerStarted","Data":"38f6dfc21404cd1704d8b6c57dc3f73d31db6860567c3c6d80423178b4fb8db3"} Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.509988 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6m4tn" 
event={"ID":"611866fc-22a0-472d-bcee-99765386e1fb","Type":"ContainerStarted","Data":"2a0aa2c98abbf6d8b4424f8f7ed2282f285567aae2596a4a121582c55613f982"}
Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.523513 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fklx9" podStartSLOduration=2.824889329 podStartE2EDuration="23.523501283s" podCreationTimestamp="2025-10-12 07:37:23 +0000 UTC" firstStartedPulling="2025-10-12 07:37:25.273245293 +0000 UTC m=+142.062440795" lastFinishedPulling="2025-10-12 07:37:45.971857247 +0000 UTC m=+162.761052749" observedRunningTime="2025-10-12 07:37:46.522532435 +0000 UTC m=+163.311727937" watchObservedRunningTime="2025-10-12 07:37:46.523501283 +0000 UTC m=+163.312696785"
Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.540102 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-68sdf" podStartSLOduration=2.779100061 podStartE2EDuration="23.540091711s" podCreationTimestamp="2025-10-12 07:37:23 +0000 UTC" firstStartedPulling="2025-10-12 07:37:25.256599831 +0000 UTC m=+142.045795333" lastFinishedPulling="2025-10-12 07:37:46.017591481 +0000 UTC m=+162.806786983" observedRunningTime="2025-10-12 07:37:46.535909829 +0000 UTC m=+163.325105331" watchObservedRunningTime="2025-10-12 07:37:46.540091711 +0000 UTC m=+163.329287212"
Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.551858 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fn444" podStartSLOduration=1.883236815 podStartE2EDuration="20.551849548s" podCreationTimestamp="2025-10-12 07:37:26 +0000 UTC" firstStartedPulling="2025-10-12 07:37:27.30661831 +0000 UTC m=+144.095813813" lastFinishedPulling="2025-10-12 07:37:45.975231044 +0000 UTC m=+162.764426546" observedRunningTime="2025-10-12 07:37:46.549589493 +0000 UTC m=+163.338784995" watchObservedRunningTime="2025-10-12 07:37:46.551849548 +0000 UTC m=+163.341045050"
Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.565079 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mxnmk" podStartSLOduration=3.721290462 podStartE2EDuration="24.565048526s" podCreationTimestamp="2025-10-12 07:37:22 +0000 UTC" firstStartedPulling="2025-10-12 07:37:25.261361347 +0000 UTC m=+142.050556849" lastFinishedPulling="2025-10-12 07:37:46.105119412 +0000 UTC m=+162.894314913" observedRunningTime="2025-10-12 07:37:46.562073291 +0000 UTC m=+163.351268794" watchObservedRunningTime="2025-10-12 07:37:46.565048526 +0000 UTC m=+163.354244017"
Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.600112 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rmrp7" podStartSLOduration=2.705713826 podStartE2EDuration="23.600094078s" podCreationTimestamp="2025-10-12 07:37:23 +0000 UTC" firstStartedPulling="2025-10-12 07:37:25.26810411 +0000 UTC m=+142.057299613" lastFinishedPulling="2025-10-12 07:37:46.162484362 +0000 UTC m=+162.951679865" observedRunningTime="2025-10-12 07:37:46.583688711 +0000 UTC m=+163.372884213" watchObservedRunningTime="2025-10-12 07:37:46.600094078 +0000 UTC m=+163.389289571"
Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.629412 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6m4tn" podStartSLOduration=1.943023495 podStartE2EDuration="20.6293945s" podCreationTimestamp="2025-10-12 07:37:26 +0000 UTC" firstStartedPulling="2025-10-12 07:37:27.307213303 +0000 UTC m=+144.096408805" lastFinishedPulling="2025-10-12 07:37:45.993584308 +0000 UTC m=+162.782779810" observedRunningTime="2025-10-12 07:37:46.622579109 +0000 UTC m=+163.411774611" watchObservedRunningTime="2025-10-12 07:37:46.6293945 +0000 UTC m=+163.418590002"
Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.652459 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kwphq" podStartSLOduration=145.652441331 podStartE2EDuration="2m25.652441331s" podCreationTimestamp="2025-10-12 07:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:37:46.650847033 +0000 UTC m=+163.440042534" watchObservedRunningTime="2025-10-12 07:37:46.652441331 +0000 UTC m=+163.441636843"
Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.653256 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cnl9m" podStartSLOduration=1.8987183189999999 podStartE2EDuration="21.653251811s" podCreationTimestamp="2025-10-12 07:37:25 +0000 UTC" firstStartedPulling="2025-10-12 07:37:26.294936776 +0000 UTC m=+143.084132278" lastFinishedPulling="2025-10-12 07:37:46.049470268 +0000 UTC m=+162.838665770" observedRunningTime="2025-10-12 07:37:46.63965918 +0000 UTC m=+163.428854682" watchObservedRunningTime="2025-10-12 07:37:46.653251811 +0000 UTC m=+163.442447312"
Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.864259 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fn444"
Oct 12 07:37:46 crc kubenswrapper[4599]: I1012 07:37:46.864643 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fn444"
Oct 12 07:37:47 crc kubenswrapper[4599]: I1012 07:37:47.978501 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fn444" podUID="0521e91a-c932-4050-9981-c52eefe3e1b9" containerName="registry-server" probeResult="failure" output=<
Oct 12 07:37:47 crc kubenswrapper[4599]: timeout: failed to connect service ":50051" within 1s
Oct 12 07:37:47 crc kubenswrapper[4599]: >
Oct 12 07:37:53 crc kubenswrapper[4599]: I1012 07:37:53.306292 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mxnmk"
Oct 12 07:37:53 crc kubenswrapper[4599]: I1012 07:37:53.306613 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mxnmk"
Oct 12 07:37:53 crc kubenswrapper[4599]: I1012 07:37:53.338855 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mxnmk"
Oct 12 07:37:53 crc kubenswrapper[4599]: I1012 07:37:53.469802 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rmrp7"
Oct 12 07:37:53 crc kubenswrapper[4599]: I1012 07:37:53.469854 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rmrp7"
Oct 12 07:37:53 crc kubenswrapper[4599]: I1012 07:37:53.501091 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rmrp7"
Oct 12 07:37:53 crc kubenswrapper[4599]: I1012 07:37:53.580961 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rmrp7"
Oct 12 07:37:53 crc kubenswrapper[4599]: I1012 07:37:53.581313 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mxnmk"
Oct 12 07:37:53 crc kubenswrapper[4599]: I1012 07:37:53.679449 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fklx9"
Oct 12 07:37:53 crc kubenswrapper[4599]: I1012 07:37:53.679485 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fklx9"
Oct 12 07:37:53 crc kubenswrapper[4599]: I1012 07:37:53.709244 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fklx9"
Oct 12 07:37:53 crc kubenswrapper[4599]: I1012 07:37:53.900961 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-68sdf"
Oct 12 07:37:53 crc kubenswrapper[4599]: I1012 07:37:53.901022 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-68sdf"
Oct 12 07:37:53 crc kubenswrapper[4599]: I1012 07:37:53.931832 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-68sdf"
Oct 12 07:37:54 crc kubenswrapper[4599]: I1012 07:37:54.587482 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fklx9"
Oct 12 07:37:54 crc kubenswrapper[4599]: I1012 07:37:54.589264 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-68sdf"
Oct 12 07:37:55 crc kubenswrapper[4599]: I1012 07:37:55.459657 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h8fml"
Oct 12 07:37:55 crc kubenswrapper[4599]: I1012 07:37:55.460107 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h8fml"
Oct 12 07:37:55 crc kubenswrapper[4599]: I1012 07:37:55.493548 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h8fml"
Oct 12 07:37:55 crc kubenswrapper[4599]: I1012 07:37:55.586708 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h8fml"
Oct 12 07:37:55 crc kubenswrapper[4599]: I1012 07:37:55.690622 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-68sdf"]
Oct 12 07:37:55 crc kubenswrapper[4599]: I1012 07:37:55.878667 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cnl9m"
Oct 12 07:37:55 crc kubenswrapper[4599]: I1012 07:37:55.879247 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cnl9m"
Oct 12 07:37:55 crc kubenswrapper[4599]: I1012 07:37:55.885928 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fklx9"]
Oct 12 07:37:55 crc kubenswrapper[4599]: I1012 07:37:55.909070 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cnl9m"
Oct 12 07:37:56 crc kubenswrapper[4599]: I1012 07:37:56.486175 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6m4tn"
Oct 12 07:37:56 crc kubenswrapper[4599]: I1012 07:37:56.486246 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6m4tn"
Oct 12 07:37:56 crc kubenswrapper[4599]: I1012 07:37:56.518798 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6m4tn"
Oct 12 07:37:56 crc kubenswrapper[4599]: I1012 07:37:56.570877 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fklx9" podUID="e2b7b084-738f-430d-8e2c-9c1b5c0ea421" containerName="registry-server" containerID="cri-o://727fb17cd757444fe2568b46a7d8af258d04c9d7becd8853ea9b3d72cb8127fd" gracePeriod=2
Oct 12 07:37:56 crc kubenswrapper[4599]: I1012 07:37:56.571712 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-68sdf" podUID="a84bfa7d-3d7b-48df-ba61-ab9640a00e1e" containerName="registry-server" containerID="cri-o://0c38cf8cd6affa4a732d3f63dc6a83c1462dbd049f1ba61c52dda5e30b7e7d4b" gracePeriod=2
Oct 12 07:37:56 crc kubenswrapper[4599]: I1012 07:37:56.616318 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cnl9m"
Oct 12 07:37:56 crc kubenswrapper[4599]: I1012 07:37:56.623956 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6m4tn"
Oct 12 07:37:56 crc kubenswrapper[4599]: I1012 07:37:56.914189 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fn444"
Oct 12 07:37:56 crc kubenswrapper[4599]: I1012 07:37:56.932378 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-68sdf"
Oct 12 07:37:56 crc kubenswrapper[4599]: I1012 07:37:56.953301 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fn444"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.033546 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fklx9"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.062088 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a84bfa7d-3d7b-48df-ba61-ab9640a00e1e-utilities\") pod \"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e\" (UID: \"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e\") "
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.062130 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a84bfa7d-3d7b-48df-ba61-ab9640a00e1e-catalog-content\") pod \"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e\" (UID: \"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e\") "
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.062170 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf2qq\" (UniqueName: \"kubernetes.io/projected/a84bfa7d-3d7b-48df-ba61-ab9640a00e1e-kube-api-access-rf2qq\") pod \"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e\" (UID: \"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e\") "
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.063034 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a84bfa7d-3d7b-48df-ba61-ab9640a00e1e-utilities" (OuterVolumeSpecName: "utilities") pod "a84bfa7d-3d7b-48df-ba61-ab9640a00e1e" (UID: "a84bfa7d-3d7b-48df-ba61-ab9640a00e1e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.069998 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a84bfa7d-3d7b-48df-ba61-ab9640a00e1e-kube-api-access-rf2qq" (OuterVolumeSpecName: "kube-api-access-rf2qq") pod "a84bfa7d-3d7b-48df-ba61-ab9640a00e1e" (UID: "a84bfa7d-3d7b-48df-ba61-ab9640a00e1e"). InnerVolumeSpecName "kube-api-access-rf2qq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.102381 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a84bfa7d-3d7b-48df-ba61-ab9640a00e1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a84bfa7d-3d7b-48df-ba61-ab9640a00e1e" (UID: "a84bfa7d-3d7b-48df-ba61-ab9640a00e1e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.163738 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9rrr\" (UniqueName: \"kubernetes.io/projected/e2b7b084-738f-430d-8e2c-9c1b5c0ea421-kube-api-access-k9rrr\") pod \"e2b7b084-738f-430d-8e2c-9c1b5c0ea421\" (UID: \"e2b7b084-738f-430d-8e2c-9c1b5c0ea421\") "
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.163847 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2b7b084-738f-430d-8e2c-9c1b5c0ea421-utilities\") pod \"e2b7b084-738f-430d-8e2c-9c1b5c0ea421\" (UID: \"e2b7b084-738f-430d-8e2c-9c1b5c0ea421\") "
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.164077 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2b7b084-738f-430d-8e2c-9c1b5c0ea421-catalog-content\") pod \"e2b7b084-738f-430d-8e2c-9c1b5c0ea421\" (UID: \"e2b7b084-738f-430d-8e2c-9c1b5c0ea421\") "
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.164654 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf2qq\" (UniqueName: \"kubernetes.io/projected/a84bfa7d-3d7b-48df-ba61-ab9640a00e1e-kube-api-access-rf2qq\") on node \"crc\" DevicePath \"\""
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.164682 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a84bfa7d-3d7b-48df-ba61-ab9640a00e1e-utilities\") on node \"crc\" DevicePath \"\""
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.164697 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a84bfa7d-3d7b-48df-ba61-ab9640a00e1e-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.164715 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2b7b084-738f-430d-8e2c-9c1b5c0ea421-utilities" (OuterVolumeSpecName: "utilities") pod "e2b7b084-738f-430d-8e2c-9c1b5c0ea421" (UID: "e2b7b084-738f-430d-8e2c-9c1b5c0ea421"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.166893 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2b7b084-738f-430d-8e2c-9c1b5c0ea421-kube-api-access-k9rrr" (OuterVolumeSpecName: "kube-api-access-k9rrr") pod "e2b7b084-738f-430d-8e2c-9c1b5c0ea421" (UID: "e2b7b084-738f-430d-8e2c-9c1b5c0ea421"). InnerVolumeSpecName "kube-api-access-k9rrr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.208279 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2b7b084-738f-430d-8e2c-9c1b5c0ea421-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2b7b084-738f-430d-8e2c-9c1b5c0ea421" (UID: "e2b7b084-738f-430d-8e2c-9c1b5c0ea421"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.266442 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2b7b084-738f-430d-8e2c-9c1b5c0ea421-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.266485 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9rrr\" (UniqueName: \"kubernetes.io/projected/e2b7b084-738f-430d-8e2c-9c1b5c0ea421-kube-api-access-k9rrr\") on node \"crc\" DevicePath \"\""
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.266504 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2b7b084-738f-430d-8e2c-9c1b5c0ea421-utilities\") on node \"crc\" DevicePath \"\""
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.578521 4599 generic.go:334] "Generic (PLEG): container finished" podID="a84bfa7d-3d7b-48df-ba61-ab9640a00e1e" containerID="0c38cf8cd6affa4a732d3f63dc6a83c1462dbd049f1ba61c52dda5e30b7e7d4b" exitCode=0
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.578588 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68sdf" event={"ID":"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e","Type":"ContainerDied","Data":"0c38cf8cd6affa4a732d3f63dc6a83c1462dbd049f1ba61c52dda5e30b7e7d4b"}
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.578993 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68sdf" event={"ID":"a84bfa7d-3d7b-48df-ba61-ab9640a00e1e","Type":"ContainerDied","Data":"1a3830d45c4e9ce666697152963d61721646271a0f8924f5d83ccb55b101f425"}
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.579025 4599 scope.go:117] "RemoveContainer" containerID="0c38cf8cd6affa4a732d3f63dc6a83c1462dbd049f1ba61c52dda5e30b7e7d4b"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.578761 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-68sdf"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.586045 4599 generic.go:334] "Generic (PLEG): container finished" podID="e2b7b084-738f-430d-8e2c-9c1b5c0ea421" containerID="727fb17cd757444fe2568b46a7d8af258d04c9d7becd8853ea9b3d72cb8127fd" exitCode=0
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.586140 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fklx9"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.586211 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fklx9" event={"ID":"e2b7b084-738f-430d-8e2c-9c1b5c0ea421","Type":"ContainerDied","Data":"727fb17cd757444fe2568b46a7d8af258d04c9d7becd8853ea9b3d72cb8127fd"}
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.586249 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fklx9" event={"ID":"e2b7b084-738f-430d-8e2c-9c1b5c0ea421","Type":"ContainerDied","Data":"cf14898142682e25100c496652de2202483bb25e8a0d0434fa1c472d854207e1"}
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.611211 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-68sdf"]
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.613528 4599 scope.go:117] "RemoveContainer" containerID="69749f74fd526565e5458a328dc98dce3975b039fbd0d3c35f08b2eb74e00589"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.614540 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-68sdf"]
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.618392 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fklx9"]
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.619064 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fklx9"]
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.629241 4599 scope.go:117] "RemoveContainer" containerID="68d8337fb2cbf76dad3dfd0f642bede2bdc1f9299272093f7d09940757493074"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.641556 4599 scope.go:117] "RemoveContainer" containerID="0c38cf8cd6affa4a732d3f63dc6a83c1462dbd049f1ba61c52dda5e30b7e7d4b"
Oct 12 07:37:57 crc kubenswrapper[4599]: E1012 07:37:57.642106 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c38cf8cd6affa4a732d3f63dc6a83c1462dbd049f1ba61c52dda5e30b7e7d4b\": container with ID starting with 0c38cf8cd6affa4a732d3f63dc6a83c1462dbd049f1ba61c52dda5e30b7e7d4b not found: ID does not exist" containerID="0c38cf8cd6affa4a732d3f63dc6a83c1462dbd049f1ba61c52dda5e30b7e7d4b"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.642149 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c38cf8cd6affa4a732d3f63dc6a83c1462dbd049f1ba61c52dda5e30b7e7d4b"} err="failed to get container status \"0c38cf8cd6affa4a732d3f63dc6a83c1462dbd049f1ba61c52dda5e30b7e7d4b\": rpc error: code = NotFound desc = could not find container \"0c38cf8cd6affa4a732d3f63dc6a83c1462dbd049f1ba61c52dda5e30b7e7d4b\": container with ID starting with 0c38cf8cd6affa4a732d3f63dc6a83c1462dbd049f1ba61c52dda5e30b7e7d4b not found: ID does not exist"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.642199 4599 scope.go:117] "RemoveContainer" containerID="69749f74fd526565e5458a328dc98dce3975b039fbd0d3c35f08b2eb74e00589"
Oct 12 07:37:57 crc kubenswrapper[4599]: E1012 07:37:57.642554 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69749f74fd526565e5458a328dc98dce3975b039fbd0d3c35f08b2eb74e00589\": container with ID starting with 69749f74fd526565e5458a328dc98dce3975b039fbd0d3c35f08b2eb74e00589 not found: ID does not exist" containerID="69749f74fd526565e5458a328dc98dce3975b039fbd0d3c35f08b2eb74e00589"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.642597 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69749f74fd526565e5458a328dc98dce3975b039fbd0d3c35f08b2eb74e00589"} err="failed to get container status \"69749f74fd526565e5458a328dc98dce3975b039fbd0d3c35f08b2eb74e00589\": rpc error: code = NotFound desc = could not find container \"69749f74fd526565e5458a328dc98dce3975b039fbd0d3c35f08b2eb74e00589\": container with ID starting with 69749f74fd526565e5458a328dc98dce3975b039fbd0d3c35f08b2eb74e00589 not found: ID does not exist"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.642625 4599 scope.go:117] "RemoveContainer" containerID="68d8337fb2cbf76dad3dfd0f642bede2bdc1f9299272093f7d09940757493074"
Oct 12 07:37:57 crc kubenswrapper[4599]: E1012 07:37:57.642957 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68d8337fb2cbf76dad3dfd0f642bede2bdc1f9299272093f7d09940757493074\": container with ID starting with 68d8337fb2cbf76dad3dfd0f642bede2bdc1f9299272093f7d09940757493074 not found: ID does not exist" containerID="68d8337fb2cbf76dad3dfd0f642bede2bdc1f9299272093f7d09940757493074"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.642986 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68d8337fb2cbf76dad3dfd0f642bede2bdc1f9299272093f7d09940757493074"} err="failed to get container status \"68d8337fb2cbf76dad3dfd0f642bede2bdc1f9299272093f7d09940757493074\": rpc error: code = NotFound desc = could not find container \"68d8337fb2cbf76dad3dfd0f642bede2bdc1f9299272093f7d09940757493074\": container with ID starting with 68d8337fb2cbf76dad3dfd0f642bede2bdc1f9299272093f7d09940757493074 not found: ID does not exist"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.643008 4599 scope.go:117] "RemoveContainer" containerID="727fb17cd757444fe2568b46a7d8af258d04c9d7becd8853ea9b3d72cb8127fd"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.655022 4599 scope.go:117] "RemoveContainer" containerID="065d2923576c63cd8855f2dfe51a09801faab8a8d4d795b8809fd778a5217064"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.675410 4599 scope.go:117] "RemoveContainer" containerID="69a0bb4c9375306a32148df64258e11dc2c7f391451403ccc07eae70380950ac"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.696920 4599 scope.go:117] "RemoveContainer" containerID="727fb17cd757444fe2568b46a7d8af258d04c9d7becd8853ea9b3d72cb8127fd"
Oct 12 07:37:57 crc kubenswrapper[4599]: E1012 07:37:57.697299 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"727fb17cd757444fe2568b46a7d8af258d04c9d7becd8853ea9b3d72cb8127fd\": container with ID starting with 727fb17cd757444fe2568b46a7d8af258d04c9d7becd8853ea9b3d72cb8127fd not found: ID does not exist" containerID="727fb17cd757444fe2568b46a7d8af258d04c9d7becd8853ea9b3d72cb8127fd"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.697349 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"727fb17cd757444fe2568b46a7d8af258d04c9d7becd8853ea9b3d72cb8127fd"} err="failed to get container status \"727fb17cd757444fe2568b46a7d8af258d04c9d7becd8853ea9b3d72cb8127fd\": rpc error: code = NotFound desc = could not find container \"727fb17cd757444fe2568b46a7d8af258d04c9d7becd8853ea9b3d72cb8127fd\": container with ID starting with 727fb17cd757444fe2568b46a7d8af258d04c9d7becd8853ea9b3d72cb8127fd not found: ID does not exist"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.697369 4599 scope.go:117] "RemoveContainer" containerID="065d2923576c63cd8855f2dfe51a09801faab8a8d4d795b8809fd778a5217064"
Oct 12 07:37:57 crc kubenswrapper[4599]: E1012 07:37:57.697677 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"065d2923576c63cd8855f2dfe51a09801faab8a8d4d795b8809fd778a5217064\": container with ID starting with 065d2923576c63cd8855f2dfe51a09801faab8a8d4d795b8809fd778a5217064 not found: ID does not exist" containerID="065d2923576c63cd8855f2dfe51a09801faab8a8d4d795b8809fd778a5217064"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.697725 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"065d2923576c63cd8855f2dfe51a09801faab8a8d4d795b8809fd778a5217064"} err="failed to get container status \"065d2923576c63cd8855f2dfe51a09801faab8a8d4d795b8809fd778a5217064\": rpc error: code = NotFound desc = could not find container \"065d2923576c63cd8855f2dfe51a09801faab8a8d4d795b8809fd778a5217064\": container with ID starting with 065d2923576c63cd8855f2dfe51a09801faab8a8d4d795b8809fd778a5217064 not found: ID does not exist"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.697754 4599 scope.go:117] "RemoveContainer" containerID="69a0bb4c9375306a32148df64258e11dc2c7f391451403ccc07eae70380950ac"
Oct 12 07:37:57 crc kubenswrapper[4599]: E1012 07:37:57.698158 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69a0bb4c9375306a32148df64258e11dc2c7f391451403ccc07eae70380950ac\": container with ID starting with 69a0bb4c9375306a32148df64258e11dc2c7f391451403ccc07eae70380950ac not found: ID does not exist" containerID="69a0bb4c9375306a32148df64258e11dc2c7f391451403ccc07eae70380950ac"
Oct 12 07:37:57 crc kubenswrapper[4599]: I1012 07:37:57.698202 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69a0bb4c9375306a32148df64258e11dc2c7f391451403ccc07eae70380950ac"} err="failed to get container status \"69a0bb4c9375306a32148df64258e11dc2c7f391451403ccc07eae70380950ac\": rpc error: code = NotFound desc = could not find container \"69a0bb4c9375306a32148df64258e11dc2c7f391451403ccc07eae70380950ac\": container with ID starting with 69a0bb4c9375306a32148df64258e11dc2c7f391451403ccc07eae70380950ac not found: ID does not exist"
Oct 12 07:37:58 crc kubenswrapper[4599]: I1012 07:37:58.087896 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cnl9m"]
Oct 12 07:37:58 crc kubenswrapper[4599]: I1012 07:37:58.322206 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 12 07:37:58 crc kubenswrapper[4599]: I1012 07:37:58.322294 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 12 07:37:59 crc kubenswrapper[4599]: I1012 07:37:59.551783 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a84bfa7d-3d7b-48df-ba61-ab9640a00e1e" path="/var/lib/kubelet/pods/a84bfa7d-3d7b-48df-ba61-ab9640a00e1e/volumes"
Oct 12 07:37:59 crc kubenswrapper[4599]: I1012 07:37:59.552950 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2b7b084-738f-430d-8e2c-9c1b5c0ea421" path="/var/lib/kubelet/pods/e2b7b084-738f-430d-8e2c-9c1b5c0ea421/volumes"
Oct 12 07:37:59 crc kubenswrapper[4599]: I1012 07:37:59.581299 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r6fd4"
Oct 12 07:37:59 crc kubenswrapper[4599]: I1012 07:37:59.598597 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cnl9m" podUID="58710253-2df7-4fef-89bd-e5b6267bd0b4" containerName="registry-server" containerID="cri-o://d27fcfd187117500009d0d39c5e4c400282c271c0e06f503096661de1b4122ff" gracePeriod=2
Oct 12 07:37:59 crc kubenswrapper[4599]: I1012 07:37:59.994681 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cnl9m"
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.127209 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58710253-2df7-4fef-89bd-e5b6267bd0b4-utilities\") pod \"58710253-2df7-4fef-89bd-e5b6267bd0b4\" (UID: \"58710253-2df7-4fef-89bd-e5b6267bd0b4\") "
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.127304 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58710253-2df7-4fef-89bd-e5b6267bd0b4-catalog-content\") pod \"58710253-2df7-4fef-89bd-e5b6267bd0b4\" (UID: \"58710253-2df7-4fef-89bd-e5b6267bd0b4\") "
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.127494 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxk5n\" (UniqueName: \"kubernetes.io/projected/58710253-2df7-4fef-89bd-e5b6267bd0b4-kube-api-access-hxk5n\") pod \"58710253-2df7-4fef-89bd-e5b6267bd0b4\" (UID: \"58710253-2df7-4fef-89bd-e5b6267bd0b4\") "
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.128029 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58710253-2df7-4fef-89bd-e5b6267bd0b4-utilities" (OuterVolumeSpecName: "utilities") pod "58710253-2df7-4fef-89bd-e5b6267bd0b4" (UID: "58710253-2df7-4fef-89bd-e5b6267bd0b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.133259 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58710253-2df7-4fef-89bd-e5b6267bd0b4-kube-api-access-hxk5n" (OuterVolumeSpecName: "kube-api-access-hxk5n") pod "58710253-2df7-4fef-89bd-e5b6267bd0b4" (UID: "58710253-2df7-4fef-89bd-e5b6267bd0b4"). InnerVolumeSpecName "kube-api-access-hxk5n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.137712 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58710253-2df7-4fef-89bd-e5b6267bd0b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58710253-2df7-4fef-89bd-e5b6267bd0b4" (UID: "58710253-2df7-4fef-89bd-e5b6267bd0b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.229083 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58710253-2df7-4fef-89bd-e5b6267bd0b4-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.229115 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxk5n\" (UniqueName: \"kubernetes.io/projected/58710253-2df7-4fef-89bd-e5b6267bd0b4-kube-api-access-hxk5n\") on node \"crc\" DevicePath \"\""
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.229128 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58710253-2df7-4fef-89bd-e5b6267bd0b4-utilities\") on node \"crc\" DevicePath \"\""
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.488894 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fn444"]
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.489127 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fn444" podUID="0521e91a-c932-4050-9981-c52eefe3e1b9" containerName="registry-server" containerID="cri-o://e70721671c353fbfad3dad2de2f9c5c92279e0610f67dc2a3908a9283d53f95d" gracePeriod=2
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.609817 4599 generic.go:334] "Generic (PLEG): container finished" podID="58710253-2df7-4fef-89bd-e5b6267bd0b4" containerID="d27fcfd187117500009d0d39c5e4c400282c271c0e06f503096661de1b4122ff" exitCode=0
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.609902 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cnl9m" event={"ID":"58710253-2df7-4fef-89bd-e5b6267bd0b4","Type":"ContainerDied","Data":"d27fcfd187117500009d0d39c5e4c400282c271c0e06f503096661de1b4122ff"}
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.609975 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cnl9m" event={"ID":"58710253-2df7-4fef-89bd-e5b6267bd0b4","Type":"ContainerDied","Data":"14e694855222391d392f8d524d45dfa4ad3502201f52be123c81a8ec403ca850"}
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.609994 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cnl9m"
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.610022 4599 scope.go:117] "RemoveContainer" containerID="d27fcfd187117500009d0d39c5e4c400282c271c0e06f503096661de1b4122ff"
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.623601 4599 scope.go:117] "RemoveContainer" containerID="c54fab04c62c44c1ea48a4f0d7d1bd55565cfd5f1c6719ce0a0d8aebbb699381"
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.638559 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cnl9m"]
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.641164 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cnl9m"]
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.655191 4599 scope.go:117] "RemoveContainer" containerID="946159b5aec9641badd7ea8eef8018d45800ea7343e5ae8073faf5077358b29c"
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.667030 4599 scope.go:117] "RemoveContainer" containerID="d27fcfd187117500009d0d39c5e4c400282c271c0e06f503096661de1b4122ff"
Oct 12 07:38:00 crc kubenswrapper[4599]: E1012 07:38:00.667387 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d27fcfd187117500009d0d39c5e4c400282c271c0e06f503096661de1b4122ff\": container with ID starting with d27fcfd187117500009d0d39c5e4c400282c271c0e06f503096661de1b4122ff not found: ID does not exist" containerID="d27fcfd187117500009d0d39c5e4c400282c271c0e06f503096661de1b4122ff"
Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.667426 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d27fcfd187117500009d0d39c5e4c400282c271c0e06f503096661de1b4122ff"} err="failed to get container status \"d27fcfd187117500009d0d39c5e4c400282c271c0e06f503096661de1b4122ff\": rpc error: code = NotFound desc = could not find container
\"d27fcfd187117500009d0d39c5e4c400282c271c0e06f503096661de1b4122ff\": container with ID starting with d27fcfd187117500009d0d39c5e4c400282c271c0e06f503096661de1b4122ff not found: ID does not exist" Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.667449 4599 scope.go:117] "RemoveContainer" containerID="c54fab04c62c44c1ea48a4f0d7d1bd55565cfd5f1c6719ce0a0d8aebbb699381" Oct 12 07:38:00 crc kubenswrapper[4599]: E1012 07:38:00.667777 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c54fab04c62c44c1ea48a4f0d7d1bd55565cfd5f1c6719ce0a0d8aebbb699381\": container with ID starting with c54fab04c62c44c1ea48a4f0d7d1bd55565cfd5f1c6719ce0a0d8aebbb699381 not found: ID does not exist" containerID="c54fab04c62c44c1ea48a4f0d7d1bd55565cfd5f1c6719ce0a0d8aebbb699381" Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.667801 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c54fab04c62c44c1ea48a4f0d7d1bd55565cfd5f1c6719ce0a0d8aebbb699381"} err="failed to get container status \"c54fab04c62c44c1ea48a4f0d7d1bd55565cfd5f1c6719ce0a0d8aebbb699381\": rpc error: code = NotFound desc = could not find container \"c54fab04c62c44c1ea48a4f0d7d1bd55565cfd5f1c6719ce0a0d8aebbb699381\": container with ID starting with c54fab04c62c44c1ea48a4f0d7d1bd55565cfd5f1c6719ce0a0d8aebbb699381 not found: ID does not exist" Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.667815 4599 scope.go:117] "RemoveContainer" containerID="946159b5aec9641badd7ea8eef8018d45800ea7343e5ae8073faf5077358b29c" Oct 12 07:38:00 crc kubenswrapper[4599]: E1012 07:38:00.668118 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"946159b5aec9641badd7ea8eef8018d45800ea7343e5ae8073faf5077358b29c\": container with ID starting with 946159b5aec9641badd7ea8eef8018d45800ea7343e5ae8073faf5077358b29c not found: ID does not exist" 
containerID="946159b5aec9641badd7ea8eef8018d45800ea7343e5ae8073faf5077358b29c" Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.668137 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"946159b5aec9641badd7ea8eef8018d45800ea7343e5ae8073faf5077358b29c"} err="failed to get container status \"946159b5aec9641badd7ea8eef8018d45800ea7343e5ae8073faf5077358b29c\": rpc error: code = NotFound desc = could not find container \"946159b5aec9641badd7ea8eef8018d45800ea7343e5ae8073faf5077358b29c\": container with ID starting with 946159b5aec9641badd7ea8eef8018d45800ea7343e5ae8073faf5077358b29c not found: ID does not exist" Oct 12 07:38:00 crc kubenswrapper[4599]: I1012 07:38:00.904821 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fn444" Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.043576 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0521e91a-c932-4050-9981-c52eefe3e1b9-utilities\") pod \"0521e91a-c932-4050-9981-c52eefe3e1b9\" (UID: \"0521e91a-c932-4050-9981-c52eefe3e1b9\") " Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.043640 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0521e91a-c932-4050-9981-c52eefe3e1b9-catalog-content\") pod \"0521e91a-c932-4050-9981-c52eefe3e1b9\" (UID: \"0521e91a-c932-4050-9981-c52eefe3e1b9\") " Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.043700 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb5r9\" (UniqueName: \"kubernetes.io/projected/0521e91a-c932-4050-9981-c52eefe3e1b9-kube-api-access-wb5r9\") pod \"0521e91a-c932-4050-9981-c52eefe3e1b9\" (UID: \"0521e91a-c932-4050-9981-c52eefe3e1b9\") " Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.044467 
4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0521e91a-c932-4050-9981-c52eefe3e1b9-utilities" (OuterVolumeSpecName: "utilities") pod "0521e91a-c932-4050-9981-c52eefe3e1b9" (UID: "0521e91a-c932-4050-9981-c52eefe3e1b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.052224 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0521e91a-c932-4050-9981-c52eefe3e1b9-kube-api-access-wb5r9" (OuterVolumeSpecName: "kube-api-access-wb5r9") pod "0521e91a-c932-4050-9981-c52eefe3e1b9" (UID: "0521e91a-c932-4050-9981-c52eefe3e1b9"). InnerVolumeSpecName "kube-api-access-wb5r9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.113912 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0521e91a-c932-4050-9981-c52eefe3e1b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0521e91a-c932-4050-9981-c52eefe3e1b9" (UID: "0521e91a-c932-4050-9981-c52eefe3e1b9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.145946 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0521e91a-c932-4050-9981-c52eefe3e1b9-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.145983 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0521e91a-c932-4050-9981-c52eefe3e1b9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.145999 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb5r9\" (UniqueName: \"kubernetes.io/projected/0521e91a-c932-4050-9981-c52eefe3e1b9-kube-api-access-wb5r9\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.551082 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58710253-2df7-4fef-89bd-e5b6267bd0b4" path="/var/lib/kubelet/pods/58710253-2df7-4fef-89bd-e5b6267bd0b4/volumes" Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.617314 4599 generic.go:334] "Generic (PLEG): container finished" podID="0521e91a-c932-4050-9981-c52eefe3e1b9" containerID="e70721671c353fbfad3dad2de2f9c5c92279e0610f67dc2a3908a9283d53f95d" exitCode=0 Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.617379 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fn444" event={"ID":"0521e91a-c932-4050-9981-c52eefe3e1b9","Type":"ContainerDied","Data":"e70721671c353fbfad3dad2de2f9c5c92279e0610f67dc2a3908a9283d53f95d"} Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.617410 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fn444" 
event={"ID":"0521e91a-c932-4050-9981-c52eefe3e1b9","Type":"ContainerDied","Data":"eac9f4bfcd96770f2335a6c6bfb1276c915b913648eb5124b1e2b6f8e3f99fe0"} Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.617428 4599 scope.go:117] "RemoveContainer" containerID="e70721671c353fbfad3dad2de2f9c5c92279e0610f67dc2a3908a9283d53f95d" Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.617563 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fn444" Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.630318 4599 scope.go:117] "RemoveContainer" containerID="8fbfdfba6136b4f3f9c59bb132c886cc17853e477843e23494637a884cb7032a" Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.630601 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fn444"] Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.636643 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fn444"] Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.644367 4599 scope.go:117] "RemoveContainer" containerID="e237d10b2590c0496d66772c943e401f156d1300bac87cde71792883e70ad367" Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.656747 4599 scope.go:117] "RemoveContainer" containerID="e70721671c353fbfad3dad2de2f9c5c92279e0610f67dc2a3908a9283d53f95d" Oct 12 07:38:01 crc kubenswrapper[4599]: E1012 07:38:01.657150 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e70721671c353fbfad3dad2de2f9c5c92279e0610f67dc2a3908a9283d53f95d\": container with ID starting with e70721671c353fbfad3dad2de2f9c5c92279e0610f67dc2a3908a9283d53f95d not found: ID does not exist" containerID="e70721671c353fbfad3dad2de2f9c5c92279e0610f67dc2a3908a9283d53f95d" Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.657187 4599 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e70721671c353fbfad3dad2de2f9c5c92279e0610f67dc2a3908a9283d53f95d"} err="failed to get container status \"e70721671c353fbfad3dad2de2f9c5c92279e0610f67dc2a3908a9283d53f95d\": rpc error: code = NotFound desc = could not find container \"e70721671c353fbfad3dad2de2f9c5c92279e0610f67dc2a3908a9283d53f95d\": container with ID starting with e70721671c353fbfad3dad2de2f9c5c92279e0610f67dc2a3908a9283d53f95d not found: ID does not exist" Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.657212 4599 scope.go:117] "RemoveContainer" containerID="8fbfdfba6136b4f3f9c59bb132c886cc17853e477843e23494637a884cb7032a" Oct 12 07:38:01 crc kubenswrapper[4599]: E1012 07:38:01.657510 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fbfdfba6136b4f3f9c59bb132c886cc17853e477843e23494637a884cb7032a\": container with ID starting with 8fbfdfba6136b4f3f9c59bb132c886cc17853e477843e23494637a884cb7032a not found: ID does not exist" containerID="8fbfdfba6136b4f3f9c59bb132c886cc17853e477843e23494637a884cb7032a" Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.657553 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fbfdfba6136b4f3f9c59bb132c886cc17853e477843e23494637a884cb7032a"} err="failed to get container status \"8fbfdfba6136b4f3f9c59bb132c886cc17853e477843e23494637a884cb7032a\": rpc error: code = NotFound desc = could not find container \"8fbfdfba6136b4f3f9c59bb132c886cc17853e477843e23494637a884cb7032a\": container with ID starting with 8fbfdfba6136b4f3f9c59bb132c886cc17853e477843e23494637a884cb7032a not found: ID does not exist" Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.657584 4599 scope.go:117] "RemoveContainer" containerID="e237d10b2590c0496d66772c943e401f156d1300bac87cde71792883e70ad367" Oct 12 07:38:01 crc kubenswrapper[4599]: E1012 07:38:01.657862 4599 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e237d10b2590c0496d66772c943e401f156d1300bac87cde71792883e70ad367\": container with ID starting with e237d10b2590c0496d66772c943e401f156d1300bac87cde71792883e70ad367 not found: ID does not exist" containerID="e237d10b2590c0496d66772c943e401f156d1300bac87cde71792883e70ad367" Oct 12 07:38:01 crc kubenswrapper[4599]: I1012 07:38:01.657888 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e237d10b2590c0496d66772c943e401f156d1300bac87cde71792883e70ad367"} err="failed to get container status \"e237d10b2590c0496d66772c943e401f156d1300bac87cde71792883e70ad367\": rpc error: code = NotFound desc = could not find container \"e237d10b2590c0496d66772c943e401f156d1300bac87cde71792883e70ad367\": container with ID starting with e237d10b2590c0496d66772c943e401f156d1300bac87cde71792883e70ad367 not found: ID does not exist" Oct 12 07:38:03 crc kubenswrapper[4599]: I1012 07:38:03.551501 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0521e91a-c932-4050-9981-c52eefe3e1b9" path="/var/lib/kubelet/pods/0521e91a-c932-4050-9981-c52eefe3e1b9/volumes" Oct 12 07:38:07 crc kubenswrapper[4599]: I1012 07:38:07.452710 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jlnn9"] Oct 12 07:38:09 crc kubenswrapper[4599]: I1012 07:38:09.664649 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 07:38:28 crc kubenswrapper[4599]: I1012 07:38:28.322291 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:38:28 crc kubenswrapper[4599]: I1012 07:38:28.323097 4599 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:38:28 crc kubenswrapper[4599]: I1012 07:38:28.323166 4599 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 07:38:28 crc kubenswrapper[4599]: I1012 07:38:28.323869 4599 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"238fe8d2b2e7f08d70bc5f938aca3c90b68c447db21a713d651d47b47a8127c2"} pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 07:38:28 crc kubenswrapper[4599]: I1012 07:38:28.323945 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" containerID="cri-o://238fe8d2b2e7f08d70bc5f938aca3c90b68c447db21a713d651d47b47a8127c2" gracePeriod=600 Oct 12 07:38:28 crc kubenswrapper[4599]: I1012 07:38:28.759701 4599 generic.go:334] "Generic (PLEG): container finished" podID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerID="238fe8d2b2e7f08d70bc5f938aca3c90b68c447db21a713d651d47b47a8127c2" exitCode=0 Oct 12 07:38:28 crc kubenswrapper[4599]: I1012 07:38:28.759794 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerDied","Data":"238fe8d2b2e7f08d70bc5f938aca3c90b68c447db21a713d651d47b47a8127c2"} Oct 12 07:38:28 crc kubenswrapper[4599]: I1012 
07:38:28.760026 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerStarted","Data":"96c58ca22fd64ff7166d37c6b5588563180da0da78f1666f39d10296101df256"} Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.475702 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" podUID="06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" containerName="oauth-openshift" containerID="cri-o://17b1c6b7ed4f23206782035e25723153ac8f1f4bdd6b7d5d18ecd73d10dbbb65" gracePeriod=15 Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.786788 4599 generic.go:334] "Generic (PLEG): container finished" podID="06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" containerID="17b1c6b7ed4f23206782035e25723153ac8f1f4bdd6b7d5d18ecd73d10dbbb65" exitCode=0 Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.786892 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" event={"ID":"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1","Type":"ContainerDied","Data":"17b1c6b7ed4f23206782035e25723153ac8f1f4bdd6b7d5d18ecd73d10dbbb65"} Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.787050 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" event={"ID":"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1","Type":"ContainerDied","Data":"8bbdada1748118d6e5a8ff116d1f17170316717597f8675d98a27970170f45c7"} Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.787069 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bbdada1748118d6e5a8ff116d1f17170316717597f8675d98a27970170f45c7" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.801662 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.826758 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-f578d5c8f-9xf5f"] Oct 12 07:38:32 crc kubenswrapper[4599]: E1012 07:38:32.826972 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe01788-af0e-426a-8c28-a26764993905" containerName="pruner" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.826990 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe01788-af0e-426a-8c28-a26764993905" containerName="pruner" Oct 12 07:38:32 crc kubenswrapper[4599]: E1012 07:38:32.826998 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a84bfa7d-3d7b-48df-ba61-ab9640a00e1e" containerName="extract-utilities" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827007 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="a84bfa7d-3d7b-48df-ba61-ab9640a00e1e" containerName="extract-utilities" Oct 12 07:38:32 crc kubenswrapper[4599]: E1012 07:38:32.827018 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58710253-2df7-4fef-89bd-e5b6267bd0b4" containerName="extract-content" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827023 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="58710253-2df7-4fef-89bd-e5b6267bd0b4" containerName="extract-content" Oct 12 07:38:32 crc kubenswrapper[4599]: E1012 07:38:32.827030 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a84bfa7d-3d7b-48df-ba61-ab9640a00e1e" containerName="extract-content" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827037 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="a84bfa7d-3d7b-48df-ba61-ab9640a00e1e" containerName="extract-content" Oct 12 07:38:32 crc kubenswrapper[4599]: E1012 07:38:32.827049 4599 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e2b7b084-738f-430d-8e2c-9c1b5c0ea421" containerName="registry-server" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827054 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b7b084-738f-430d-8e2c-9c1b5c0ea421" containerName="registry-server" Oct 12 07:38:32 crc kubenswrapper[4599]: E1012 07:38:32.827063 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd71739-e6a2-4279-9d03-6e94dd3d95e8" containerName="pruner" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827068 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd71739-e6a2-4279-9d03-6e94dd3d95e8" containerName="pruner" Oct 12 07:38:32 crc kubenswrapper[4599]: E1012 07:38:32.827074 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a84bfa7d-3d7b-48df-ba61-ab9640a00e1e" containerName="registry-server" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827079 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="a84bfa7d-3d7b-48df-ba61-ab9640a00e1e" containerName="registry-server" Oct 12 07:38:32 crc kubenswrapper[4599]: E1012 07:38:32.827087 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2b7b084-738f-430d-8e2c-9c1b5c0ea421" containerName="extract-content" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827093 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b7b084-738f-430d-8e2c-9c1b5c0ea421" containerName="extract-content" Oct 12 07:38:32 crc kubenswrapper[4599]: E1012 07:38:32.827100 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0521e91a-c932-4050-9981-c52eefe3e1b9" containerName="registry-server" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827107 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="0521e91a-c932-4050-9981-c52eefe3e1b9" containerName="registry-server" Oct 12 07:38:32 crc kubenswrapper[4599]: E1012 07:38:32.827116 4599 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0521e91a-c932-4050-9981-c52eefe3e1b9" containerName="extract-utilities" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827122 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="0521e91a-c932-4050-9981-c52eefe3e1b9" containerName="extract-utilities" Oct 12 07:38:32 crc kubenswrapper[4599]: E1012 07:38:32.827130 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" containerName="oauth-openshift" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827137 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" containerName="oauth-openshift" Oct 12 07:38:32 crc kubenswrapper[4599]: E1012 07:38:32.827147 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58710253-2df7-4fef-89bd-e5b6267bd0b4" containerName="extract-utilities" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827153 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="58710253-2df7-4fef-89bd-e5b6267bd0b4" containerName="extract-utilities" Oct 12 07:38:32 crc kubenswrapper[4599]: E1012 07:38:32.827160 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0521e91a-c932-4050-9981-c52eefe3e1b9" containerName="extract-content" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827165 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="0521e91a-c932-4050-9981-c52eefe3e1b9" containerName="extract-content" Oct 12 07:38:32 crc kubenswrapper[4599]: E1012 07:38:32.827171 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58710253-2df7-4fef-89bd-e5b6267bd0b4" containerName="registry-server" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827176 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="58710253-2df7-4fef-89bd-e5b6267bd0b4" containerName="registry-server" Oct 12 07:38:32 crc kubenswrapper[4599]: E1012 07:38:32.827183 4599 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e2b7b084-738f-430d-8e2c-9c1b5c0ea421" containerName="extract-utilities" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827189 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b7b084-738f-430d-8e2c-9c1b5c0ea421" containerName="extract-utilities" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827270 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="58710253-2df7-4fef-89bd-e5b6267bd0b4" containerName="registry-server" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827279 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2b7b084-738f-430d-8e2c-9c1b5c0ea421" containerName="registry-server" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827288 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="0521e91a-c932-4050-9981-c52eefe3e1b9" containerName="registry-server" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827295 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dd71739-e6a2-4279-9d03-6e94dd3d95e8" containerName="pruner" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827307 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe01788-af0e-426a-8c28-a26764993905" containerName="pruner" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827315 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="a84bfa7d-3d7b-48df-ba61-ab9640a00e1e" containerName="registry-server" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827321 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" containerName="oauth-openshift" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.827723 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.835574 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-f578d5c8f-9xf5f"] Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.971549 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-service-ca\") pod \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.971618 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zws9k\" (UniqueName: \"kubernetes.io/projected/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-kube-api-access-zws9k\") pod \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.971657 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-template-error\") pod \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.971677 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-idp-0-file-data\") pod \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.971694 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-cliconfig\") pod \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.971719 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-ocp-branding-template\") pod \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.971743 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-session\") pod \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.971769 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-template-provider-selection\") pod \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.971792 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-serving-cert\") pod \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.971812 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-trusted-ca-bundle\") pod \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.971829 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-audit-policies\") pod \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.971851 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-router-certs\") pod \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.971872 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-template-login\") pod \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.971890 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-audit-dir\") pod \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\" (UID: \"06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1\") " Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.971966 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r77dd\" (UniqueName: \"kubernetes.io/projected/a5dd3683-3b10-487e-8655-b5b24e4e6d52-kube-api-access-r77dd\") pod \"oauth-openshift-f578d5c8f-9xf5f\" 
(UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.971994 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-user-template-error\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.972018 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a5dd3683-3b10-487e-8655-b5b24e4e6d52-audit-policies\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.972048 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.972068 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 
07:38:32.972087 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-service-ca\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.972105 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a5dd3683-3b10-487e-8655-b5b24e4e6d52-audit-dir\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.972124 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-session\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.972144 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.972163 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.972185 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-router-certs\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.972220 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.972245 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-user-template-login\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.972266 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.973083 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" (UID: "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.973323 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" (UID: "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.974104 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" (UID: "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.974376 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" (UID: "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.974744 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" (UID: "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.979714 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" (UID: "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.980318 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-kube-api-access-zws9k" (OuterVolumeSpecName: "kube-api-access-zws9k") pod "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" (UID: "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1"). InnerVolumeSpecName "kube-api-access-zws9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.980777 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" (UID: "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.981077 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" (UID: "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.981318 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" (UID: "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.981626 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" (UID: "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.981876 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" (UID: "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.982136 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" (UID: "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:38:32 crc kubenswrapper[4599]: I1012 07:38:32.984093 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" (UID: "06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.073745 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-user-template-error\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.073798 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a5dd3683-3b10-487e-8655-b5b24e4e6d52-audit-policies\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.073826 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.073847 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.073862 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-service-ca\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.073880 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a5dd3683-3b10-487e-8655-b5b24e4e6d52-audit-dir\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.073896 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-session\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: 
\"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.073913 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.073933 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.073954 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-router-certs\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.073997 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.074017 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-user-template-login\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.074034 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.074061 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r77dd\" (UniqueName: \"kubernetes.io/projected/a5dd3683-3b10-487e-8655-b5b24e4e6d52-kube-api-access-r77dd\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.074098 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.074112 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.074124 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.074134 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.074143 4599 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.074153 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.074164 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.074173 4599 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.074183 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.074191 4599 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-zws9k\" (UniqueName: \"kubernetes.io/projected/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-kube-api-access-zws9k\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.074201 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.074211 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.074221 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.074232 4599 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.074498 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a5dd3683-3b10-487e-8655-b5b24e4e6d52-audit-dir\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.074781 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-service-ca\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.075076 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a5dd3683-3b10-487e-8655-b5b24e4e6d52-audit-policies\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.075274 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.075449 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.077896 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-session\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 
07:38:33.078105 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.078151 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.078328 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-user-template-error\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.078989 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.078990 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-user-template-login\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.079130 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-system-router-certs\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.079850 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a5dd3683-3b10-487e-8655-b5b24e4e6d52-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.089442 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r77dd\" (UniqueName: \"kubernetes.io/projected/a5dd3683-3b10-487e-8655-b5b24e4e6d52-kube-api-access-r77dd\") pod \"oauth-openshift-f578d5c8f-9xf5f\" (UID: \"a5dd3683-3b10-487e-8655-b5b24e4e6d52\") " pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.140463 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.493395 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-f578d5c8f-9xf5f"] Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.793163 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" event={"ID":"a5dd3683-3b10-487e-8655-b5b24e4e6d52","Type":"ContainerStarted","Data":"e2ef71903961072ffc4fde68ed672d9e789795dad230689205606e51c4ffb585"} Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.793656 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.793738 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" event={"ID":"a5dd3683-3b10-487e-8655-b5b24e4e6d52","Type":"ContainerStarted","Data":"3ca25602fad3fe659773dfd986ddce564e4e8a46f1c4aa9c351f13c3632915c7"} Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.793221 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jlnn9" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.815370 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" podStartSLOduration=26.815327442 podStartE2EDuration="26.815327442s" podCreationTimestamp="2025-10-12 07:38:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:38:33.810432369 +0000 UTC m=+210.599627870" watchObservedRunningTime="2025-10-12 07:38:33.815327442 +0000 UTC m=+210.604522944" Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.827259 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jlnn9"] Oct 12 07:38:33 crc kubenswrapper[4599]: I1012 07:38:33.832685 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jlnn9"] Oct 12 07:38:34 crc kubenswrapper[4599]: I1012 07:38:34.025227 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-f578d5c8f-9xf5f" Oct 12 07:38:35 crc kubenswrapper[4599]: I1012 07:38:35.550367 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1" path="/var/lib/kubelet/pods/06a7a4ad-fac1-44c9-adb4-c0ccd968c6b1/volumes" Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.705083 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rmrp7"] Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.705850 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rmrp7" podUID="b914c467-be05-49d4-a391-d98254248ade" containerName="registry-server" 
containerID="cri-o://2d8c1f800c24b0c46ecf9d481d0d9309c367575ed5e96f952ad8b4560243327f" gracePeriod=30 Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.709360 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxnmk"] Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.709650 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mxnmk" podUID="ddcc528b-bf0a-404c-b121-cee466cb352c" containerName="registry-server" containerID="cri-o://c790f0beda1a60adb007e007d9ac5d70ab13706da271eb3da0975bf047676b30" gracePeriod=30 Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.717028 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ks8z9"] Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.717205 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" podUID="d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5" containerName="marketplace-operator" containerID="cri-o://54a02e07e2e27b2c5e82be454f3c251d42fcc14b5404f74bd9a3ac1e6bc3a845" gracePeriod=30 Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.730424 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h8fml"] Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.730728 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h8fml" podUID="cef6ba33-1bde-40ae-8025-05f6234cd636" containerName="registry-server" containerID="cri-o://4f37fb21a9d7b651793aa3c92edc4f69bd375b1e19319aead6676b3387455032" gracePeriod=30 Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.731155 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2vntc"] Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 
07:38:44.732010 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2vntc" Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.745957 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2vntc"] Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.748273 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6m4tn"] Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.748572 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6m4tn" podUID="611866fc-22a0-472d-bcee-99765386e1fb" containerName="registry-server" containerID="cri-o://2a0aa2c98abbf6d8b4424f8f7ed2282f285567aae2596a4a121582c55613f982" gracePeriod=30 Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.838377 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/04a727ef-7194-4df1-b0a2-0107085a972d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2vntc\" (UID: \"04a727ef-7194-4df1-b0a2-0107085a972d\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vntc" Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.838453 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04a727ef-7194-4df1-b0a2-0107085a972d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2vntc\" (UID: \"04a727ef-7194-4df1-b0a2-0107085a972d\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vntc" Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.838490 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln7cg\" (UniqueName: 
\"kubernetes.io/projected/04a727ef-7194-4df1-b0a2-0107085a972d-kube-api-access-ln7cg\") pod \"marketplace-operator-79b997595-2vntc\" (UID: \"04a727ef-7194-4df1-b0a2-0107085a972d\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vntc" Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.852746 4599 generic.go:334] "Generic (PLEG): container finished" podID="ddcc528b-bf0a-404c-b121-cee466cb352c" containerID="c790f0beda1a60adb007e007d9ac5d70ab13706da271eb3da0975bf047676b30" exitCode=0 Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.852897 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxnmk" event={"ID":"ddcc528b-bf0a-404c-b121-cee466cb352c","Type":"ContainerDied","Data":"c790f0beda1a60adb007e007d9ac5d70ab13706da271eb3da0975bf047676b30"} Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.857997 4599 generic.go:334] "Generic (PLEG): container finished" podID="cef6ba33-1bde-40ae-8025-05f6234cd636" containerID="4f37fb21a9d7b651793aa3c92edc4f69bd375b1e19319aead6676b3387455032" exitCode=0 Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.858089 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8fml" event={"ID":"cef6ba33-1bde-40ae-8025-05f6234cd636","Type":"ContainerDied","Data":"4f37fb21a9d7b651793aa3c92edc4f69bd375b1e19319aead6676b3387455032"} Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.861050 4599 generic.go:334] "Generic (PLEG): container finished" podID="b914c467-be05-49d4-a391-d98254248ade" containerID="2d8c1f800c24b0c46ecf9d481d0d9309c367575ed5e96f952ad8b4560243327f" exitCode=0 Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.861124 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmrp7" event={"ID":"b914c467-be05-49d4-a391-d98254248ade","Type":"ContainerDied","Data":"2d8c1f800c24b0c46ecf9d481d0d9309c367575ed5e96f952ad8b4560243327f"} Oct 12 07:38:44 crc 
kubenswrapper[4599]: I1012 07:38:44.862864 4599 generic.go:334] "Generic (PLEG): container finished" podID="d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5" containerID="54a02e07e2e27b2c5e82be454f3c251d42fcc14b5404f74bd9a3ac1e6bc3a845" exitCode=0 Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.862925 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" event={"ID":"d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5","Type":"ContainerDied","Data":"54a02e07e2e27b2c5e82be454f3c251d42fcc14b5404f74bd9a3ac1e6bc3a845"} Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.940393 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/04a727ef-7194-4df1-b0a2-0107085a972d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2vntc\" (UID: \"04a727ef-7194-4df1-b0a2-0107085a972d\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vntc" Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.940469 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04a727ef-7194-4df1-b0a2-0107085a972d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2vntc\" (UID: \"04a727ef-7194-4df1-b0a2-0107085a972d\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vntc" Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.940506 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln7cg\" (UniqueName: \"kubernetes.io/projected/04a727ef-7194-4df1-b0a2-0107085a972d-kube-api-access-ln7cg\") pod \"marketplace-operator-79b997595-2vntc\" (UID: \"04a727ef-7194-4df1-b0a2-0107085a972d\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vntc" Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.942068 4599 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04a727ef-7194-4df1-b0a2-0107085a972d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2vntc\" (UID: \"04a727ef-7194-4df1-b0a2-0107085a972d\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vntc" Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.949375 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/04a727ef-7194-4df1-b0a2-0107085a972d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2vntc\" (UID: \"04a727ef-7194-4df1-b0a2-0107085a972d\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vntc" Oct 12 07:38:44 crc kubenswrapper[4599]: I1012 07:38:44.955831 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln7cg\" (UniqueName: \"kubernetes.io/projected/04a727ef-7194-4df1-b0a2-0107085a972d-kube-api-access-ln7cg\") pod \"marketplace-operator-79b997595-2vntc\" (UID: \"04a727ef-7194-4df1-b0a2-0107085a972d\") " pod="openshift-marketplace/marketplace-operator-79b997595-2vntc" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.049943 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2vntc" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.135050 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rmrp7" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.142568 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b914c467-be05-49d4-a391-d98254248ade-catalog-content\") pod \"b914c467-be05-49d4-a391-d98254248ade\" (UID: \"b914c467-be05-49d4-a391-d98254248ade\") " Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.142615 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b914c467-be05-49d4-a391-d98254248ade-utilities\") pod \"b914c467-be05-49d4-a391-d98254248ade\" (UID: \"b914c467-be05-49d4-a391-d98254248ade\") " Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.142643 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dbtv\" (UniqueName: \"kubernetes.io/projected/b914c467-be05-49d4-a391-d98254248ade-kube-api-access-5dbtv\") pod \"b914c467-be05-49d4-a391-d98254248ade\" (UID: \"b914c467-be05-49d4-a391-d98254248ade\") " Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.147949 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b914c467-be05-49d4-a391-d98254248ade-utilities" (OuterVolumeSpecName: "utilities") pod "b914c467-be05-49d4-a391-d98254248ade" (UID: "b914c467-be05-49d4-a391-d98254248ade"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.165595 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b914c467-be05-49d4-a391-d98254248ade-kube-api-access-5dbtv" (OuterVolumeSpecName: "kube-api-access-5dbtv") pod "b914c467-be05-49d4-a391-d98254248ade" (UID: "b914c467-be05-49d4-a391-d98254248ade"). InnerVolumeSpecName "kube-api-access-5dbtv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.197773 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b914c467-be05-49d4-a391-d98254248ade-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b914c467-be05-49d4-a391-d98254248ade" (UID: "b914c467-be05-49d4-a391-d98254248ade"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.209862 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h8fml" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.227169 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.236267 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxnmk" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.240155 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6m4tn" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.244458 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/611866fc-22a0-472d-bcee-99765386e1fb-catalog-content\") pod \"611866fc-22a0-472d-bcee-99765386e1fb\" (UID: \"611866fc-22a0-472d-bcee-99765386e1fb\") " Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.244495 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/611866fc-22a0-472d-bcee-99765386e1fb-utilities\") pod \"611866fc-22a0-472d-bcee-99765386e1fb\" (UID: \"611866fc-22a0-472d-bcee-99765386e1fb\") " Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.244542 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt5rp\" (UniqueName: \"kubernetes.io/projected/ddcc528b-bf0a-404c-b121-cee466cb352c-kube-api-access-mt5rp\") pod \"ddcc528b-bf0a-404c-b121-cee466cb352c\" (UID: \"ddcc528b-bf0a-404c-b121-cee466cb352c\") " Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.244565 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddcc528b-bf0a-404c-b121-cee466cb352c-utilities\") pod \"ddcc528b-bf0a-404c-b121-cee466cb352c\" (UID: \"ddcc528b-bf0a-404c-b121-cee466cb352c\") " Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.244902 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b914c467-be05-49d4-a391-d98254248ade-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.244960 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b914c467-be05-49d4-a391-d98254248ade-utilities\") on node \"crc\" 
DevicePath \"\"" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.244972 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dbtv\" (UniqueName: \"kubernetes.io/projected/b914c467-be05-49d4-a391-d98254248ade-kube-api-access-5dbtv\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.245748 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/611866fc-22a0-472d-bcee-99765386e1fb-utilities" (OuterVolumeSpecName: "utilities") pod "611866fc-22a0-472d-bcee-99765386e1fb" (UID: "611866fc-22a0-472d-bcee-99765386e1fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.249500 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddcc528b-bf0a-404c-b121-cee466cb352c-kube-api-access-mt5rp" (OuterVolumeSpecName: "kube-api-access-mt5rp") pod "ddcc528b-bf0a-404c-b121-cee466cb352c" (UID: "ddcc528b-bf0a-404c-b121-cee466cb352c"). InnerVolumeSpecName "kube-api-access-mt5rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.251789 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddcc528b-bf0a-404c-b121-cee466cb352c-utilities" (OuterVolumeSpecName: "utilities") pod "ddcc528b-bf0a-404c-b121-cee466cb352c" (UID: "ddcc528b-bf0a-404c-b121-cee466cb352c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.319195 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/611866fc-22a0-472d-bcee-99765386e1fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "611866fc-22a0-472d-bcee-99765386e1fb" (UID: "611866fc-22a0-472d-bcee-99765386e1fb"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.345365 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbqcx\" (UniqueName: \"kubernetes.io/projected/611866fc-22a0-472d-bcee-99765386e1fb-kube-api-access-jbqcx\") pod \"611866fc-22a0-472d-bcee-99765386e1fb\" (UID: \"611866fc-22a0-472d-bcee-99765386e1fb\") " Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.345441 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef6ba33-1bde-40ae-8025-05f6234cd636-catalog-content\") pod \"cef6ba33-1bde-40ae-8025-05f6234cd636\" (UID: \"cef6ba33-1bde-40ae-8025-05f6234cd636\") " Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.345472 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlglp\" (UniqueName: \"kubernetes.io/projected/cef6ba33-1bde-40ae-8025-05f6234cd636-kube-api-access-jlglp\") pod \"cef6ba33-1bde-40ae-8025-05f6234cd636\" (UID: \"cef6ba33-1bde-40ae-8025-05f6234cd636\") " Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.345495 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5-marketplace-trusted-ca\") pod \"d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5\" (UID: \"d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5\") " Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.345519 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef6ba33-1bde-40ae-8025-05f6234cd636-utilities\") pod \"cef6ba33-1bde-40ae-8025-05f6234cd636\" (UID: \"cef6ba33-1bde-40ae-8025-05f6234cd636\") " Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.345541 4599 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-d9sst\" (UniqueName: \"kubernetes.io/projected/d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5-kube-api-access-d9sst\") pod \"d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5\" (UID: \"d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5\") " Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.345582 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5-marketplace-operator-metrics\") pod \"d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5\" (UID: \"d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5\") " Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.345643 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddcc528b-bf0a-404c-b121-cee466cb352c-catalog-content\") pod \"ddcc528b-bf0a-404c-b121-cee466cb352c\" (UID: \"ddcc528b-bf0a-404c-b121-cee466cb352c\") " Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.345824 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/611866fc-22a0-472d-bcee-99765386e1fb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.345843 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/611866fc-22a0-472d-bcee-99765386e1fb-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.345853 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt5rp\" (UniqueName: \"kubernetes.io/projected/ddcc528b-bf0a-404c-b121-cee466cb352c-kube-api-access-mt5rp\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.345863 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ddcc528b-bf0a-404c-b121-cee466cb352c-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.346253 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cef6ba33-1bde-40ae-8025-05f6234cd636-utilities" (OuterVolumeSpecName: "utilities") pod "cef6ba33-1bde-40ae-8025-05f6234cd636" (UID: "cef6ba33-1bde-40ae-8025-05f6234cd636"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.346676 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5" (UID: "d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.348907 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cef6ba33-1bde-40ae-8025-05f6234cd636-kube-api-access-jlglp" (OuterVolumeSpecName: "kube-api-access-jlglp") pod "cef6ba33-1bde-40ae-8025-05f6234cd636" (UID: "cef6ba33-1bde-40ae-8025-05f6234cd636"). InnerVolumeSpecName "kube-api-access-jlglp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.349107 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/611866fc-22a0-472d-bcee-99765386e1fb-kube-api-access-jbqcx" (OuterVolumeSpecName: "kube-api-access-jbqcx") pod "611866fc-22a0-472d-bcee-99765386e1fb" (UID: "611866fc-22a0-472d-bcee-99765386e1fb"). InnerVolumeSpecName "kube-api-access-jbqcx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.349211 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5-kube-api-access-d9sst" (OuterVolumeSpecName: "kube-api-access-d9sst") pod "d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5" (UID: "d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5"). InnerVolumeSpecName "kube-api-access-d9sst". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.349487 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5" (UID: "d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.359614 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cef6ba33-1bde-40ae-8025-05f6234cd636-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cef6ba33-1bde-40ae-8025-05f6234cd636" (UID: "cef6ba33-1bde-40ae-8025-05f6234cd636"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.392981 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddcc528b-bf0a-404c-b121-cee466cb352c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ddcc528b-bf0a-404c-b121-cee466cb352c" (UID: "ddcc528b-bf0a-404c-b121-cee466cb352c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.446670 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddcc528b-bf0a-404c-b121-cee466cb352c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.447142 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbqcx\" (UniqueName: \"kubernetes.io/projected/611866fc-22a0-472d-bcee-99765386e1fb-kube-api-access-jbqcx\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.447209 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef6ba33-1bde-40ae-8025-05f6234cd636-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.447271 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlglp\" (UniqueName: \"kubernetes.io/projected/cef6ba33-1bde-40ae-8025-05f6234cd636-kube-api-access-jlglp\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.447323 4599 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.447394 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef6ba33-1bde-40ae-8025-05f6234cd636-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.447456 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9sst\" (UniqueName: \"kubernetes.io/projected/d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5-kube-api-access-d9sst\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:45 
crc kubenswrapper[4599]: I1012 07:38:45.447506 4599 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.501811 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2vntc"] Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.869687 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmrp7" event={"ID":"b914c467-be05-49d4-a391-d98254248ade","Type":"ContainerDied","Data":"cb1a4eb0176453d2db2fd0781e4bb24420268e13aae3fee3ec5f1f6f46db20bf"} Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.869977 4599 scope.go:117] "RemoveContainer" containerID="2d8c1f800c24b0c46ecf9d481d0d9309c367575ed5e96f952ad8b4560243327f" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.869794 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rmrp7" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.873915 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8fml" event={"ID":"cef6ba33-1bde-40ae-8025-05f6234cd636","Type":"ContainerDied","Data":"363864e6962e87086bd9cdba331a628918793ccc61055dd55edf3f96ce957a92"} Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.874039 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h8fml" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.875907 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" event={"ID":"d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5","Type":"ContainerDied","Data":"b4392f94f29a381b82692ec9446e895be57b994a594e05e1c1a0a4ff0e032d96"} Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.875919 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ks8z9" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.877762 4599 generic.go:334] "Generic (PLEG): container finished" podID="611866fc-22a0-472d-bcee-99765386e1fb" containerID="2a0aa2c98abbf6d8b4424f8f7ed2282f285567aae2596a4a121582c55613f982" exitCode=0 Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.877805 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6m4tn" event={"ID":"611866fc-22a0-472d-bcee-99765386e1fb","Type":"ContainerDied","Data":"2a0aa2c98abbf6d8b4424f8f7ed2282f285567aae2596a4a121582c55613f982"} Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.877821 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6m4tn" event={"ID":"611866fc-22a0-472d-bcee-99765386e1fb","Type":"ContainerDied","Data":"828ea93e289771fbe9c6ec8eaf07d5f983c250a55738e23993598393acb0e85a"} Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.877847 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6m4tn" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.881874 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2vntc" event={"ID":"04a727ef-7194-4df1-b0a2-0107085a972d","Type":"ContainerStarted","Data":"e94d55aef8cf67fb327c272b892ab3554a5525a571619474918f3f4467733d56"} Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.881895 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2vntc" event={"ID":"04a727ef-7194-4df1-b0a2-0107085a972d","Type":"ContainerStarted","Data":"4d0611a72de3eb012697ee67ff3e6e74d707d5f833f2dca7c3e3975b448f523d"} Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.882139 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2vntc" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.885210 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxnmk" event={"ID":"ddcc528b-bf0a-404c-b121-cee466cb352c","Type":"ContainerDied","Data":"95c3ef1d3053912b86beb5a65805e9b00636503d078e67ca47e8a1b4c17423c1"} Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.885289 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxnmk" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.886076 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2vntc" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.896855 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2vntc" podStartSLOduration=1.896836298 podStartE2EDuration="1.896836298s" podCreationTimestamp="2025-10-12 07:38:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:38:45.896281652 +0000 UTC m=+222.685477154" watchObservedRunningTime="2025-10-12 07:38:45.896836298 +0000 UTC m=+222.686031800" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.901993 4599 scope.go:117] "RemoveContainer" containerID="8fa8e97086ab5538d032f280770d063f09ce1a6f5d2223d709ccf9bacefd5317" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.917830 4599 scope.go:117] "RemoveContainer" containerID="34746348f8f031d16440cfbf96cb4273432f70222570190aa8f7506da485c2f0" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.941176 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h8fml"] Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.941369 4599 scope.go:117] "RemoveContainer" containerID="4f37fb21a9d7b651793aa3c92edc4f69bd375b1e19319aead6676b3387455032" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.945279 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h8fml"] Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.947929 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rmrp7"] Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.950226 4599 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rmrp7"] Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.953727 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxnmk"] Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.955641 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mxnmk"] Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.964579 4599 scope.go:117] "RemoveContainer" containerID="8d175c94e96ae077668ee88795397d4d970088af5ffda7b5f89fa1bd2dfd5195" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.966900 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6m4tn"] Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.969362 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6m4tn"] Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.973079 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ks8z9"] Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.974953 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ks8z9"] Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.977592 4599 scope.go:117] "RemoveContainer" containerID="f9bcdfcfcbd626b50e1866253e8a8023879ea09f069527d80a08f5b24e586abe" Oct 12 07:38:45 crc kubenswrapper[4599]: I1012 07:38:45.990162 4599 scope.go:117] "RemoveContainer" containerID="54a02e07e2e27b2c5e82be454f3c251d42fcc14b5404f74bd9a3ac1e6bc3a845" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.002548 4599 scope.go:117] "RemoveContainer" containerID="2a0aa2c98abbf6d8b4424f8f7ed2282f285567aae2596a4a121582c55613f982" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.012204 4599 scope.go:117] "RemoveContainer" 
containerID="12c02f40f3fcb5442fb475b028a9ace7e32b4f8202bf652b231fe98262c1879b" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.024670 4599 scope.go:117] "RemoveContainer" containerID="a88249d06a85ba5bd9b7f291cfbd8aef6edf63782fbf6926332fe4dcd164c1fd" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.037392 4599 scope.go:117] "RemoveContainer" containerID="2a0aa2c98abbf6d8b4424f8f7ed2282f285567aae2596a4a121582c55613f982" Oct 12 07:38:46 crc kubenswrapper[4599]: E1012 07:38:46.037772 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a0aa2c98abbf6d8b4424f8f7ed2282f285567aae2596a4a121582c55613f982\": container with ID starting with 2a0aa2c98abbf6d8b4424f8f7ed2282f285567aae2596a4a121582c55613f982 not found: ID does not exist" containerID="2a0aa2c98abbf6d8b4424f8f7ed2282f285567aae2596a4a121582c55613f982" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.037812 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a0aa2c98abbf6d8b4424f8f7ed2282f285567aae2596a4a121582c55613f982"} err="failed to get container status \"2a0aa2c98abbf6d8b4424f8f7ed2282f285567aae2596a4a121582c55613f982\": rpc error: code = NotFound desc = could not find container \"2a0aa2c98abbf6d8b4424f8f7ed2282f285567aae2596a4a121582c55613f982\": container with ID starting with 2a0aa2c98abbf6d8b4424f8f7ed2282f285567aae2596a4a121582c55613f982 not found: ID does not exist" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.037837 4599 scope.go:117] "RemoveContainer" containerID="12c02f40f3fcb5442fb475b028a9ace7e32b4f8202bf652b231fe98262c1879b" Oct 12 07:38:46 crc kubenswrapper[4599]: E1012 07:38:46.038187 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12c02f40f3fcb5442fb475b028a9ace7e32b4f8202bf652b231fe98262c1879b\": container with ID starting with 
12c02f40f3fcb5442fb475b028a9ace7e32b4f8202bf652b231fe98262c1879b not found: ID does not exist" containerID="12c02f40f3fcb5442fb475b028a9ace7e32b4f8202bf652b231fe98262c1879b" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.038225 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12c02f40f3fcb5442fb475b028a9ace7e32b4f8202bf652b231fe98262c1879b"} err="failed to get container status \"12c02f40f3fcb5442fb475b028a9ace7e32b4f8202bf652b231fe98262c1879b\": rpc error: code = NotFound desc = could not find container \"12c02f40f3fcb5442fb475b028a9ace7e32b4f8202bf652b231fe98262c1879b\": container with ID starting with 12c02f40f3fcb5442fb475b028a9ace7e32b4f8202bf652b231fe98262c1879b not found: ID does not exist" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.038282 4599 scope.go:117] "RemoveContainer" containerID="a88249d06a85ba5bd9b7f291cfbd8aef6edf63782fbf6926332fe4dcd164c1fd" Oct 12 07:38:46 crc kubenswrapper[4599]: E1012 07:38:46.038600 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a88249d06a85ba5bd9b7f291cfbd8aef6edf63782fbf6926332fe4dcd164c1fd\": container with ID starting with a88249d06a85ba5bd9b7f291cfbd8aef6edf63782fbf6926332fe4dcd164c1fd not found: ID does not exist" containerID="a88249d06a85ba5bd9b7f291cfbd8aef6edf63782fbf6926332fe4dcd164c1fd" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.038640 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a88249d06a85ba5bd9b7f291cfbd8aef6edf63782fbf6926332fe4dcd164c1fd"} err="failed to get container status \"a88249d06a85ba5bd9b7f291cfbd8aef6edf63782fbf6926332fe4dcd164c1fd\": rpc error: code = NotFound desc = could not find container \"a88249d06a85ba5bd9b7f291cfbd8aef6edf63782fbf6926332fe4dcd164c1fd\": container with ID starting with a88249d06a85ba5bd9b7f291cfbd8aef6edf63782fbf6926332fe4dcd164c1fd not found: ID does not 
exist" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.038668 4599 scope.go:117] "RemoveContainer" containerID="c790f0beda1a60adb007e007d9ac5d70ab13706da271eb3da0975bf047676b30" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.051306 4599 scope.go:117] "RemoveContainer" containerID="0a603fb84f4af91bda8aa098a056118d0aa96c622321220603ce048e4bd960cc" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.066490 4599 scope.go:117] "RemoveContainer" containerID="76588b57c57fba6c8aaa04548ee3ad8b47b75c9f8dba00cc60db84fd032524f6" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.922456 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xkm4x"] Oct 12 07:38:46 crc kubenswrapper[4599]: E1012 07:38:46.922964 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5" containerName="marketplace-operator" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.922978 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5" containerName="marketplace-operator" Oct 12 07:38:46 crc kubenswrapper[4599]: E1012 07:38:46.922990 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddcc528b-bf0a-404c-b121-cee466cb352c" containerName="registry-server" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.922997 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddcc528b-bf0a-404c-b121-cee466cb352c" containerName="registry-server" Oct 12 07:38:46 crc kubenswrapper[4599]: E1012 07:38:46.923010 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="611866fc-22a0-472d-bcee-99765386e1fb" containerName="extract-utilities" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.923016 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="611866fc-22a0-472d-bcee-99765386e1fb" containerName="extract-utilities" Oct 12 07:38:46 crc kubenswrapper[4599]: E1012 07:38:46.923024 4599 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="611866fc-22a0-472d-bcee-99765386e1fb" containerName="registry-server" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.923030 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="611866fc-22a0-472d-bcee-99765386e1fb" containerName="registry-server" Oct 12 07:38:46 crc kubenswrapper[4599]: E1012 07:38:46.923039 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef6ba33-1bde-40ae-8025-05f6234cd636" containerName="extract-content" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.923045 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef6ba33-1bde-40ae-8025-05f6234cd636" containerName="extract-content" Oct 12 07:38:46 crc kubenswrapper[4599]: E1012 07:38:46.923055 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b914c467-be05-49d4-a391-d98254248ade" containerName="extract-content" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.923061 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="b914c467-be05-49d4-a391-d98254248ade" containerName="extract-content" Oct 12 07:38:46 crc kubenswrapper[4599]: E1012 07:38:46.923067 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef6ba33-1bde-40ae-8025-05f6234cd636" containerName="registry-server" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.923073 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef6ba33-1bde-40ae-8025-05f6234cd636" containerName="registry-server" Oct 12 07:38:46 crc kubenswrapper[4599]: E1012 07:38:46.923082 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddcc528b-bf0a-404c-b121-cee466cb352c" containerName="extract-content" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.923088 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddcc528b-bf0a-404c-b121-cee466cb352c" containerName="extract-content" Oct 12 07:38:46 crc kubenswrapper[4599]: E1012 07:38:46.923097 4599 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef6ba33-1bde-40ae-8025-05f6234cd636" containerName="extract-utilities" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.923125 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef6ba33-1bde-40ae-8025-05f6234cd636" containerName="extract-utilities" Oct 12 07:38:46 crc kubenswrapper[4599]: E1012 07:38:46.923135 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="611866fc-22a0-472d-bcee-99765386e1fb" containerName="extract-content" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.923143 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="611866fc-22a0-472d-bcee-99765386e1fb" containerName="extract-content" Oct 12 07:38:46 crc kubenswrapper[4599]: E1012 07:38:46.923152 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b914c467-be05-49d4-a391-d98254248ade" containerName="extract-utilities" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.923158 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="b914c467-be05-49d4-a391-d98254248ade" containerName="extract-utilities" Oct 12 07:38:46 crc kubenswrapper[4599]: E1012 07:38:46.923167 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b914c467-be05-49d4-a391-d98254248ade" containerName="registry-server" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.923172 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="b914c467-be05-49d4-a391-d98254248ade" containerName="registry-server" Oct 12 07:38:46 crc kubenswrapper[4599]: E1012 07:38:46.923180 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddcc528b-bf0a-404c-b121-cee466cb352c" containerName="extract-utilities" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.923188 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddcc528b-bf0a-404c-b121-cee466cb352c" containerName="extract-utilities" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.923304 4599 
memory_manager.go:354] "RemoveStaleState removing state" podUID="611866fc-22a0-472d-bcee-99765386e1fb" containerName="registry-server" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.923313 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5" containerName="marketplace-operator" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.923324 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddcc528b-bf0a-404c-b121-cee466cb352c" containerName="registry-server" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.923349 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="b914c467-be05-49d4-a391-d98254248ade" containerName="registry-server" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.923355 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef6ba33-1bde-40ae-8025-05f6234cd636" containerName="registry-server" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.924042 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xkm4x" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.925770 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 12 07:38:46 crc kubenswrapper[4599]: I1012 07:38:46.931474 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkm4x"] Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.065537 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgrj4\" (UniqueName: \"kubernetes.io/projected/4b6aaa9e-4f52-45ec-994b-c0b4e600c00f-kube-api-access-qgrj4\") pod \"redhat-marketplace-xkm4x\" (UID: \"4b6aaa9e-4f52-45ec-994b-c0b4e600c00f\") " pod="openshift-marketplace/redhat-marketplace-xkm4x" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.065674 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b6aaa9e-4f52-45ec-994b-c0b4e600c00f-utilities\") pod \"redhat-marketplace-xkm4x\" (UID: \"4b6aaa9e-4f52-45ec-994b-c0b4e600c00f\") " pod="openshift-marketplace/redhat-marketplace-xkm4x" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.065989 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b6aaa9e-4f52-45ec-994b-c0b4e600c00f-catalog-content\") pod \"redhat-marketplace-xkm4x\" (UID: \"4b6aaa9e-4f52-45ec-994b-c0b4e600c00f\") " pod="openshift-marketplace/redhat-marketplace-xkm4x" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.126664 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9d992"] Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.127718 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9d992" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.130275 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.134562 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9d992"] Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.167097 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgrj4\" (UniqueName: \"kubernetes.io/projected/4b6aaa9e-4f52-45ec-994b-c0b4e600c00f-kube-api-access-qgrj4\") pod \"redhat-marketplace-xkm4x\" (UID: \"4b6aaa9e-4f52-45ec-994b-c0b4e600c00f\") " pod="openshift-marketplace/redhat-marketplace-xkm4x" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.167145 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89-catalog-content\") pod \"redhat-operators-9d992\" (UID: \"ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89\") " pod="openshift-marketplace/redhat-operators-9d992" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.167179 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj4qk\" (UniqueName: \"kubernetes.io/projected/ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89-kube-api-access-jj4qk\") pod \"redhat-operators-9d992\" (UID: \"ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89\") " pod="openshift-marketplace/redhat-operators-9d992" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.167204 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b6aaa9e-4f52-45ec-994b-c0b4e600c00f-utilities\") pod \"redhat-marketplace-xkm4x\" (UID: 
\"4b6aaa9e-4f52-45ec-994b-c0b4e600c00f\") " pod="openshift-marketplace/redhat-marketplace-xkm4x" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.167247 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b6aaa9e-4f52-45ec-994b-c0b4e600c00f-catalog-content\") pod \"redhat-marketplace-xkm4x\" (UID: \"4b6aaa9e-4f52-45ec-994b-c0b4e600c00f\") " pod="openshift-marketplace/redhat-marketplace-xkm4x" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.167273 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89-utilities\") pod \"redhat-operators-9d992\" (UID: \"ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89\") " pod="openshift-marketplace/redhat-operators-9d992" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.167684 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b6aaa9e-4f52-45ec-994b-c0b4e600c00f-utilities\") pod \"redhat-marketplace-xkm4x\" (UID: \"4b6aaa9e-4f52-45ec-994b-c0b4e600c00f\") " pod="openshift-marketplace/redhat-marketplace-xkm4x" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.167747 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b6aaa9e-4f52-45ec-994b-c0b4e600c00f-catalog-content\") pod \"redhat-marketplace-xkm4x\" (UID: \"4b6aaa9e-4f52-45ec-994b-c0b4e600c00f\") " pod="openshift-marketplace/redhat-marketplace-xkm4x" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.182455 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgrj4\" (UniqueName: \"kubernetes.io/projected/4b6aaa9e-4f52-45ec-994b-c0b4e600c00f-kube-api-access-qgrj4\") pod \"redhat-marketplace-xkm4x\" (UID: \"4b6aaa9e-4f52-45ec-994b-c0b4e600c00f\") " 
pod="openshift-marketplace/redhat-marketplace-xkm4x" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.244561 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xkm4x" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.281431 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89-catalog-content\") pod \"redhat-operators-9d992\" (UID: \"ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89\") " pod="openshift-marketplace/redhat-operators-9d992" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.281636 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj4qk\" (UniqueName: \"kubernetes.io/projected/ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89-kube-api-access-jj4qk\") pod \"redhat-operators-9d992\" (UID: \"ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89\") " pod="openshift-marketplace/redhat-operators-9d992" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.281740 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89-utilities\") pod \"redhat-operators-9d992\" (UID: \"ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89\") " pod="openshift-marketplace/redhat-operators-9d992" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.282150 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89-catalog-content\") pod \"redhat-operators-9d992\" (UID: \"ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89\") " pod="openshift-marketplace/redhat-operators-9d992" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.282219 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89-utilities\") pod \"redhat-operators-9d992\" (UID: \"ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89\") " pod="openshift-marketplace/redhat-operators-9d992" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.297891 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj4qk\" (UniqueName: \"kubernetes.io/projected/ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89-kube-api-access-jj4qk\") pod \"redhat-operators-9d992\" (UID: \"ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89\") " pod="openshift-marketplace/redhat-operators-9d992" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.446660 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9d992" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.550882 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="611866fc-22a0-472d-bcee-99765386e1fb" path="/var/lib/kubelet/pods/611866fc-22a0-472d-bcee-99765386e1fb/volumes" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.551554 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b914c467-be05-49d4-a391-d98254248ade" path="/var/lib/kubelet/pods/b914c467-be05-49d4-a391-d98254248ade/volumes" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.552190 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cef6ba33-1bde-40ae-8025-05f6234cd636" path="/var/lib/kubelet/pods/cef6ba33-1bde-40ae-8025-05f6234cd636/volumes" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.553203 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5" path="/var/lib/kubelet/pods/d7e17165-80a0-4e3a-921f-a7ccbf9a9fe5/volumes" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.553800 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddcc528b-bf0a-404c-b121-cee466cb352c" 
path="/var/lib/kubelet/pods/ddcc528b-bf0a-404c-b121-cee466cb352c/volumes" Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.602731 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkm4x"] Oct 12 07:38:47 crc kubenswrapper[4599]: W1012 07:38:47.607751 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b6aaa9e_4f52_45ec_994b_c0b4e600c00f.slice/crio-1751b66bd0fd6310cd7f4e493a3c69cd71a876cb4622f587161a49d88ea257c7 WatchSource:0}: Error finding container 1751b66bd0fd6310cd7f4e493a3c69cd71a876cb4622f587161a49d88ea257c7: Status 404 returned error can't find the container with id 1751b66bd0fd6310cd7f4e493a3c69cd71a876cb4622f587161a49d88ea257c7 Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.789940 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9d992"] Oct 12 07:38:47 crc kubenswrapper[4599]: W1012 07:38:47.796092 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff5e25ae_0293_4ddc_8e3e_1c27d4aa9d89.slice/crio-d4431beec3554babdf2ac6cf791a0c7a11f733f1d893f408a054bc82b848191e WatchSource:0}: Error finding container d4431beec3554babdf2ac6cf791a0c7a11f733f1d893f408a054bc82b848191e: Status 404 returned error can't find the container with id d4431beec3554babdf2ac6cf791a0c7a11f733f1d893f408a054bc82b848191e Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.901910 4599 generic.go:334] "Generic (PLEG): container finished" podID="ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89" containerID="3261b9f462d514a3ed31537a88a1b2758f09d9c5a709248362d0b359b30c3a74" exitCode=0 Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.902012 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9d992" 
event={"ID":"ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89","Type":"ContainerDied","Data":"3261b9f462d514a3ed31537a88a1b2758f09d9c5a709248362d0b359b30c3a74"} Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.902356 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9d992" event={"ID":"ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89","Type":"ContainerStarted","Data":"d4431beec3554babdf2ac6cf791a0c7a11f733f1d893f408a054bc82b848191e"} Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.907032 4599 generic.go:334] "Generic (PLEG): container finished" podID="4b6aaa9e-4f52-45ec-994b-c0b4e600c00f" containerID="9f5e27a4386da3b0c394afb7c28970933eb375a7d2c18de65a600f3c628b9792" exitCode=0 Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.907131 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkm4x" event={"ID":"4b6aaa9e-4f52-45ec-994b-c0b4e600c00f","Type":"ContainerDied","Data":"9f5e27a4386da3b0c394afb7c28970933eb375a7d2c18de65a600f3c628b9792"} Oct 12 07:38:47 crc kubenswrapper[4599]: I1012 07:38:47.907185 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkm4x" event={"ID":"4b6aaa9e-4f52-45ec-994b-c0b4e600c00f","Type":"ContainerStarted","Data":"1751b66bd0fd6310cd7f4e493a3c69cd71a876cb4622f587161a49d88ea257c7"} Oct 12 07:38:48 crc kubenswrapper[4599]: I1012 07:38:48.914500 4599 generic.go:334] "Generic (PLEG): container finished" podID="4b6aaa9e-4f52-45ec-994b-c0b4e600c00f" containerID="33e88e6c53c59110fa851aa0df802610e9b653f35779c1417ceae531a72c5732" exitCode=0 Oct 12 07:38:48 crc kubenswrapper[4599]: I1012 07:38:48.914586 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkm4x" event={"ID":"4b6aaa9e-4f52-45ec-994b-c0b4e600c00f","Type":"ContainerDied","Data":"33e88e6c53c59110fa851aa0df802610e9b653f35779c1417ceae531a72c5732"} Oct 12 07:38:48 crc kubenswrapper[4599]: I1012 
07:38:48.916541 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9d992" event={"ID":"ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89","Type":"ContainerStarted","Data":"959687c7cddd47effc329f426d290826767d7892980eb6e9509409fb6a195047"} Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.329042 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vz257"] Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.330066 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vz257" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.332471 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.334251 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vz257"] Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.407895 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldqt5\" (UniqueName: \"kubernetes.io/projected/45933fcb-2999-4505-8356-fb50f8d1e2c7-kube-api-access-ldqt5\") pod \"community-operators-vz257\" (UID: \"45933fcb-2999-4505-8356-fb50f8d1e2c7\") " pod="openshift-marketplace/community-operators-vz257" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.407961 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45933fcb-2999-4505-8356-fb50f8d1e2c7-catalog-content\") pod \"community-operators-vz257\" (UID: \"45933fcb-2999-4505-8356-fb50f8d1e2c7\") " pod="openshift-marketplace/community-operators-vz257" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.408043 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45933fcb-2999-4505-8356-fb50f8d1e2c7-utilities\") pod \"community-operators-vz257\" (UID: \"45933fcb-2999-4505-8356-fb50f8d1e2c7\") " pod="openshift-marketplace/community-operators-vz257" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.508536 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldqt5\" (UniqueName: \"kubernetes.io/projected/45933fcb-2999-4505-8356-fb50f8d1e2c7-kube-api-access-ldqt5\") pod \"community-operators-vz257\" (UID: \"45933fcb-2999-4505-8356-fb50f8d1e2c7\") " pod="openshift-marketplace/community-operators-vz257" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.508599 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45933fcb-2999-4505-8356-fb50f8d1e2c7-catalog-content\") pod \"community-operators-vz257\" (UID: \"45933fcb-2999-4505-8356-fb50f8d1e2c7\") " pod="openshift-marketplace/community-operators-vz257" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.508635 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45933fcb-2999-4505-8356-fb50f8d1e2c7-utilities\") pod \"community-operators-vz257\" (UID: \"45933fcb-2999-4505-8356-fb50f8d1e2c7\") " pod="openshift-marketplace/community-operators-vz257" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.509089 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45933fcb-2999-4505-8356-fb50f8d1e2c7-utilities\") pod \"community-operators-vz257\" (UID: \"45933fcb-2999-4505-8356-fb50f8d1e2c7\") " pod="openshift-marketplace/community-operators-vz257" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.509319 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/45933fcb-2999-4505-8356-fb50f8d1e2c7-catalog-content\") pod \"community-operators-vz257\" (UID: \"45933fcb-2999-4505-8356-fb50f8d1e2c7\") " pod="openshift-marketplace/community-operators-vz257" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.526234 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sn9x6"] Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.527197 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sn9x6" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.530172 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldqt5\" (UniqueName: \"kubernetes.io/projected/45933fcb-2999-4505-8356-fb50f8d1e2c7-kube-api-access-ldqt5\") pod \"community-operators-vz257\" (UID: \"45933fcb-2999-4505-8356-fb50f8d1e2c7\") " pod="openshift-marketplace/community-operators-vz257" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.532159 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.537587 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sn9x6"] Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.609822 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3548e29-bf51-474a-9110-60bfad743fd3-catalog-content\") pod \"certified-operators-sn9x6\" (UID: \"c3548e29-bf51-474a-9110-60bfad743fd3\") " pod="openshift-marketplace/certified-operators-sn9x6" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.610021 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c3548e29-bf51-474a-9110-60bfad743fd3-utilities\") pod \"certified-operators-sn9x6\" (UID: \"c3548e29-bf51-474a-9110-60bfad743fd3\") " pod="openshift-marketplace/certified-operators-sn9x6" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.610119 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd588\" (UniqueName: \"kubernetes.io/projected/c3548e29-bf51-474a-9110-60bfad743fd3-kube-api-access-gd588\") pod \"certified-operators-sn9x6\" (UID: \"c3548e29-bf51-474a-9110-60bfad743fd3\") " pod="openshift-marketplace/certified-operators-sn9x6" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.648661 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vz257" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.712239 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd588\" (UniqueName: \"kubernetes.io/projected/c3548e29-bf51-474a-9110-60bfad743fd3-kube-api-access-gd588\") pod \"certified-operators-sn9x6\" (UID: \"c3548e29-bf51-474a-9110-60bfad743fd3\") " pod="openshift-marketplace/certified-operators-sn9x6" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.712733 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3548e29-bf51-474a-9110-60bfad743fd3-catalog-content\") pod \"certified-operators-sn9x6\" (UID: \"c3548e29-bf51-474a-9110-60bfad743fd3\") " pod="openshift-marketplace/certified-operators-sn9x6" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.712766 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3548e29-bf51-474a-9110-60bfad743fd3-utilities\") pod \"certified-operators-sn9x6\" (UID: \"c3548e29-bf51-474a-9110-60bfad743fd3\") " 
pod="openshift-marketplace/certified-operators-sn9x6" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.713186 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3548e29-bf51-474a-9110-60bfad743fd3-catalog-content\") pod \"certified-operators-sn9x6\" (UID: \"c3548e29-bf51-474a-9110-60bfad743fd3\") " pod="openshift-marketplace/certified-operators-sn9x6" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.713276 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3548e29-bf51-474a-9110-60bfad743fd3-utilities\") pod \"certified-operators-sn9x6\" (UID: \"c3548e29-bf51-474a-9110-60bfad743fd3\") " pod="openshift-marketplace/certified-operators-sn9x6" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.732907 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd588\" (UniqueName: \"kubernetes.io/projected/c3548e29-bf51-474a-9110-60bfad743fd3-kube-api-access-gd588\") pod \"certified-operators-sn9x6\" (UID: \"c3548e29-bf51-474a-9110-60bfad743fd3\") " pod="openshift-marketplace/certified-operators-sn9x6" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.854592 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sn9x6" Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.922870 4599 generic.go:334] "Generic (PLEG): container finished" podID="ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89" containerID="959687c7cddd47effc329f426d290826767d7892980eb6e9509409fb6a195047" exitCode=0 Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.922948 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9d992" event={"ID":"ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89","Type":"ContainerDied","Data":"959687c7cddd47effc329f426d290826767d7892980eb6e9509409fb6a195047"} Oct 12 07:38:49 crc kubenswrapper[4599]: I1012 07:38:49.925425 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkm4x" event={"ID":"4b6aaa9e-4f52-45ec-994b-c0b4e600c00f","Type":"ContainerStarted","Data":"86a268f52441369ceaa785c37a5c1ca81ed5d6bf1918fedbbca92987774f8f26"} Oct 12 07:38:50 crc kubenswrapper[4599]: I1012 07:38:50.018273 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xkm4x" podStartSLOduration=2.547650598 podStartE2EDuration="4.018249773s" podCreationTimestamp="2025-10-12 07:38:46 +0000 UTC" firstStartedPulling="2025-10-12 07:38:47.908729679 +0000 UTC m=+224.697925181" lastFinishedPulling="2025-10-12 07:38:49.379328854 +0000 UTC m=+226.168524356" observedRunningTime="2025-10-12 07:38:49.957355875 +0000 UTC m=+226.746551376" watchObservedRunningTime="2025-10-12 07:38:50.018249773 +0000 UTC m=+226.807445274" Oct 12 07:38:50 crc kubenswrapper[4599]: I1012 07:38:50.022488 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sn9x6"] Oct 12 07:38:50 crc kubenswrapper[4599]: W1012 07:38:50.025971 4599 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3548e29_bf51_474a_9110_60bfad743fd3.slice/crio-0564840188e51cfc3d888fe182cf739d6f500954e78fd69cfa21189310a444c0 WatchSource:0}: Error finding container 0564840188e51cfc3d888fe182cf739d6f500954e78fd69cfa21189310a444c0: Status 404 returned error can't find the container with id 0564840188e51cfc3d888fe182cf739d6f500954e78fd69cfa21189310a444c0 Oct 12 07:38:50 crc kubenswrapper[4599]: I1012 07:38:50.033980 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vz257"] Oct 12 07:38:50 crc kubenswrapper[4599]: I1012 07:38:50.934140 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9d992" event={"ID":"ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89","Type":"ContainerStarted","Data":"53fc07bfc7af02dfa1dc91d9498ef84a550fe864792f90c4c64aba2279e04db0"} Oct 12 07:38:50 crc kubenswrapper[4599]: I1012 07:38:50.937390 4599 generic.go:334] "Generic (PLEG): container finished" podID="45933fcb-2999-4505-8356-fb50f8d1e2c7" containerID="d9d47349b2bae2287bd902f04f772c2b8acf97ef49367d56d1dc9ba452e53fe8" exitCode=0 Oct 12 07:38:50 crc kubenswrapper[4599]: I1012 07:38:50.937555 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vz257" event={"ID":"45933fcb-2999-4505-8356-fb50f8d1e2c7","Type":"ContainerDied","Data":"d9d47349b2bae2287bd902f04f772c2b8acf97ef49367d56d1dc9ba452e53fe8"} Oct 12 07:38:50 crc kubenswrapper[4599]: I1012 07:38:50.937620 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vz257" event={"ID":"45933fcb-2999-4505-8356-fb50f8d1e2c7","Type":"ContainerStarted","Data":"f57edd82363a65b8baf652675aea20a8fcf7875c9619bd22512f50b8bcd3c515"} Oct 12 07:38:50 crc kubenswrapper[4599]: I1012 07:38:50.944594 4599 generic.go:334] "Generic (PLEG): container finished" podID="c3548e29-bf51-474a-9110-60bfad743fd3" 
containerID="cae6fe0522e1ec96058bde5e729651dc35ad8f6cedb40d556e1f30b9fc4faa8b" exitCode=0 Oct 12 07:38:50 crc kubenswrapper[4599]: I1012 07:38:50.944823 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn9x6" event={"ID":"c3548e29-bf51-474a-9110-60bfad743fd3","Type":"ContainerDied","Data":"cae6fe0522e1ec96058bde5e729651dc35ad8f6cedb40d556e1f30b9fc4faa8b"} Oct 12 07:38:50 crc kubenswrapper[4599]: I1012 07:38:50.944888 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn9x6" event={"ID":"c3548e29-bf51-474a-9110-60bfad743fd3","Type":"ContainerStarted","Data":"0564840188e51cfc3d888fe182cf739d6f500954e78fd69cfa21189310a444c0"} Oct 12 07:38:50 crc kubenswrapper[4599]: I1012 07:38:50.952137 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9d992" podStartSLOduration=1.374920675 podStartE2EDuration="3.952119121s" podCreationTimestamp="2025-10-12 07:38:47 +0000 UTC" firstStartedPulling="2025-10-12 07:38:47.904081542 +0000 UTC m=+224.693277044" lastFinishedPulling="2025-10-12 07:38:50.481279988 +0000 UTC m=+227.270475490" observedRunningTime="2025-10-12 07:38:50.950469731 +0000 UTC m=+227.739665233" watchObservedRunningTime="2025-10-12 07:38:50.952119121 +0000 UTC m=+227.741314624" Oct 12 07:38:51 crc kubenswrapper[4599]: I1012 07:38:51.951444 4599 generic.go:334] "Generic (PLEG): container finished" podID="45933fcb-2999-4505-8356-fb50f8d1e2c7" containerID="db38674b828cccd13d95d2e9f11a274dccd199b94dc50ec07f6388dc02464466" exitCode=0 Oct 12 07:38:51 crc kubenswrapper[4599]: I1012 07:38:51.951547 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vz257" event={"ID":"45933fcb-2999-4505-8356-fb50f8d1e2c7","Type":"ContainerDied","Data":"db38674b828cccd13d95d2e9f11a274dccd199b94dc50ec07f6388dc02464466"} Oct 12 07:38:51 crc kubenswrapper[4599]: I1012 
07:38:51.954826 4599 generic.go:334] "Generic (PLEG): container finished" podID="c3548e29-bf51-474a-9110-60bfad743fd3" containerID="d6c7908b330235a5b12e94139404fca90a50bece6d0c9e13210a734a3a299229" exitCode=0 Oct 12 07:38:51 crc kubenswrapper[4599]: I1012 07:38:51.954881 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn9x6" event={"ID":"c3548e29-bf51-474a-9110-60bfad743fd3","Type":"ContainerDied","Data":"d6c7908b330235a5b12e94139404fca90a50bece6d0c9e13210a734a3a299229"} Oct 12 07:38:52 crc kubenswrapper[4599]: I1012 07:38:52.962107 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vz257" event={"ID":"45933fcb-2999-4505-8356-fb50f8d1e2c7","Type":"ContainerStarted","Data":"0de00d68f84605328074550c3d57f3caf5789a13abd69577aca7163d1c4db166"} Oct 12 07:38:52 crc kubenswrapper[4599]: I1012 07:38:52.964751 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn9x6" event={"ID":"c3548e29-bf51-474a-9110-60bfad743fd3","Type":"ContainerStarted","Data":"0958a3e1a3d3411777a29155b52de70573c28eba816245e0e295fa0d7685410e"} Oct 12 07:38:52 crc kubenswrapper[4599]: I1012 07:38:52.981756 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vz257" podStartSLOduration=2.500538957 podStartE2EDuration="3.981744508s" podCreationTimestamp="2025-10-12 07:38:49 +0000 UTC" firstStartedPulling="2025-10-12 07:38:50.944519046 +0000 UTC m=+227.733714548" lastFinishedPulling="2025-10-12 07:38:52.425724597 +0000 UTC m=+229.214920099" observedRunningTime="2025-10-12 07:38:52.981239716 +0000 UTC m=+229.770435218" watchObservedRunningTime="2025-10-12 07:38:52.981744508 +0000 UTC m=+229.770940011" Oct 12 07:38:52 crc kubenswrapper[4599]: I1012 07:38:52.999665 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sn9x6" 
podStartSLOduration=2.5117548210000002 podStartE2EDuration="3.999652937s" podCreationTimestamp="2025-10-12 07:38:49 +0000 UTC" firstStartedPulling="2025-10-12 07:38:50.946121799 +0000 UTC m=+227.735317300" lastFinishedPulling="2025-10-12 07:38:52.434019913 +0000 UTC m=+229.223215416" observedRunningTime="2025-10-12 07:38:52.997511407 +0000 UTC m=+229.786706910" watchObservedRunningTime="2025-10-12 07:38:52.999652937 +0000 UTC m=+229.788848439" Oct 12 07:38:57 crc kubenswrapper[4599]: I1012 07:38:57.245197 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xkm4x" Oct 12 07:38:57 crc kubenswrapper[4599]: I1012 07:38:57.245866 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xkm4x" Oct 12 07:38:57 crc kubenswrapper[4599]: I1012 07:38:57.280545 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xkm4x" Oct 12 07:38:57 crc kubenswrapper[4599]: I1012 07:38:57.447184 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9d992" Oct 12 07:38:57 crc kubenswrapper[4599]: I1012 07:38:57.447287 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9d992" Oct 12 07:38:57 crc kubenswrapper[4599]: I1012 07:38:57.475185 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9d992" Oct 12 07:38:58 crc kubenswrapper[4599]: I1012 07:38:58.017506 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9d992" Oct 12 07:38:58 crc kubenswrapper[4599]: I1012 07:38:58.017569 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xkm4x" Oct 12 07:38:59 crc kubenswrapper[4599]: 
I1012 07:38:59.649487 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vz257" Oct 12 07:38:59 crc kubenswrapper[4599]: I1012 07:38:59.649550 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vz257" Oct 12 07:38:59 crc kubenswrapper[4599]: I1012 07:38:59.681884 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vz257" Oct 12 07:38:59 crc kubenswrapper[4599]: I1012 07:38:59.854875 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sn9x6" Oct 12 07:38:59 crc kubenswrapper[4599]: I1012 07:38:59.855057 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sn9x6" Oct 12 07:38:59 crc kubenswrapper[4599]: I1012 07:38:59.885110 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sn9x6" Oct 12 07:39:00 crc kubenswrapper[4599]: I1012 07:39:00.025692 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sn9x6" Oct 12 07:39:00 crc kubenswrapper[4599]: I1012 07:39:00.027364 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vz257" Oct 12 07:40:28 crc kubenswrapper[4599]: I1012 07:40:28.322296 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:40:28 crc kubenswrapper[4599]: I1012 07:40:28.324511 4599 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:40:58 crc kubenswrapper[4599]: I1012 07:40:58.321437 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:40:58 crc kubenswrapper[4599]: I1012 07:40:58.322185 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:41:28 crc kubenswrapper[4599]: I1012 07:41:28.322263 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:41:28 crc kubenswrapper[4599]: I1012 07:41:28.322929 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:41:28 crc kubenswrapper[4599]: I1012 07:41:28.322985 4599 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 07:41:28 crc 
kubenswrapper[4599]: I1012 07:41:28.323690 4599 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"96c58ca22fd64ff7166d37c6b5588563180da0da78f1666f39d10296101df256"} pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 07:41:28 crc kubenswrapper[4599]: I1012 07:41:28.323758 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" containerID="cri-o://96c58ca22fd64ff7166d37c6b5588563180da0da78f1666f39d10296101df256" gracePeriod=600 Oct 12 07:41:28 crc kubenswrapper[4599]: I1012 07:41:28.695785 4599 generic.go:334] "Generic (PLEG): container finished" podID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerID="96c58ca22fd64ff7166d37c6b5588563180da0da78f1666f39d10296101df256" exitCode=0 Oct 12 07:41:28 crc kubenswrapper[4599]: I1012 07:41:28.695871 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerDied","Data":"96c58ca22fd64ff7166d37c6b5588563180da0da78f1666f39d10296101df256"} Oct 12 07:41:28 crc kubenswrapper[4599]: I1012 07:41:28.696095 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerStarted","Data":"fa4ea3304924aa5e47754fb164316c2f3c9af596068fee357fa89cb1b44eb67a"} Oct 12 07:41:28 crc kubenswrapper[4599]: I1012 07:41:28.696121 4599 scope.go:117] "RemoveContainer" containerID="238fe8d2b2e7f08d70bc5f938aca3c90b68c447db21a713d651d47b47a8127c2" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.369930 4599 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-rhjz5"] Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.371072 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.384187 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-rhjz5"] Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.473155 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/822e6aed-f8c9-4003-bf75-692841a1bca7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.473221 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/822e6aed-f8c9-4003-bf75-692841a1bca7-registry-certificates\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.473292 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/822e6aed-f8c9-4003-bf75-692841a1bca7-trusted-ca\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.473362 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/822e6aed-f8c9-4003-bf75-692841a1bca7-registry-tls\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.473392 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/822e6aed-f8c9-4003-bf75-692841a1bca7-bound-sa-token\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.473432 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/822e6aed-f8c9-4003-bf75-692841a1bca7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.473509 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.473552 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ctt2\" (UniqueName: \"kubernetes.io/projected/822e6aed-f8c9-4003-bf75-692841a1bca7-kube-api-access-6ctt2\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 
07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.488862 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.574788 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ctt2\" (UniqueName: \"kubernetes.io/projected/822e6aed-f8c9-4003-bf75-692841a1bca7-kube-api-access-6ctt2\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.574853 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/822e6aed-f8c9-4003-bf75-692841a1bca7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.574884 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/822e6aed-f8c9-4003-bf75-692841a1bca7-registry-certificates\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.574905 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/822e6aed-f8c9-4003-bf75-692841a1bca7-trusted-ca\") pod 
\"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.574937 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/822e6aed-f8c9-4003-bf75-692841a1bca7-registry-tls\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.574957 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/822e6aed-f8c9-4003-bf75-692841a1bca7-bound-sa-token\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.574979 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/822e6aed-f8c9-4003-bf75-692841a1bca7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.575451 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/822e6aed-f8c9-4003-bf75-692841a1bca7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.576888 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/822e6aed-f8c9-4003-bf75-692841a1bca7-registry-certificates\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.577019 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/822e6aed-f8c9-4003-bf75-692841a1bca7-trusted-ca\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.580738 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/822e6aed-f8c9-4003-bf75-692841a1bca7-registry-tls\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.580782 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/822e6aed-f8c9-4003-bf75-692841a1bca7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.589449 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/822e6aed-f8c9-4003-bf75-692841a1bca7-bound-sa-token\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.589876 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6ctt2\" (UniqueName: \"kubernetes.io/projected/822e6aed-f8c9-4003-bf75-692841a1bca7-kube-api-access-6ctt2\") pod \"image-registry-66df7c8f76-rhjz5\" (UID: \"822e6aed-f8c9-4003-bf75-692841a1bca7\") " pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:44 crc kubenswrapper[4599]: I1012 07:42:44.685300 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:45 crc kubenswrapper[4599]: I1012 07:42:45.029568 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-rhjz5"] Oct 12 07:42:45 crc kubenswrapper[4599]: I1012 07:42:45.071016 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" event={"ID":"822e6aed-f8c9-4003-bf75-692841a1bca7","Type":"ContainerStarted","Data":"b278feb939b0c19c26bcccd201e4adb0a2217bc6d55f01e92b745805156ba421"} Oct 12 07:42:46 crc kubenswrapper[4599]: I1012 07:42:46.077887 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" event={"ID":"822e6aed-f8c9-4003-bf75-692841a1bca7","Type":"ContainerStarted","Data":"6bbd526cbbfad0128f0be80066f43f4ed6a5e562efb97914d0a86365b439d810"} Oct 12 07:42:46 crc kubenswrapper[4599]: I1012 07:42:46.078046 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:42:46 crc kubenswrapper[4599]: I1012 07:42:46.094782 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" podStartSLOduration=2.09476055 podStartE2EDuration="2.09476055s" podCreationTimestamp="2025-10-12 07:42:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:42:46.09432233 
+0000 UTC m=+462.883517833" watchObservedRunningTime="2025-10-12 07:42:46.09476055 +0000 UTC m=+462.883956052" Oct 12 07:43:04 crc kubenswrapper[4599]: I1012 07:43:04.690506 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-rhjz5" Oct 12 07:43:04 crc kubenswrapper[4599]: I1012 07:43:04.729679 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fwlgl"] Oct 12 07:43:16 crc kubenswrapper[4599]: I1012 07:43:16.844952 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-kpxnw"] Oct 12 07:43:16 crc kubenswrapper[4599]: I1012 07:43:16.846032 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-kpxnw" Oct 12 07:43:16 crc kubenswrapper[4599]: I1012 07:43:16.848365 4599 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-ls9kk" Oct 12 07:43:16 crc kubenswrapper[4599]: I1012 07:43:16.848442 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 12 07:43:16 crc kubenswrapper[4599]: I1012 07:43:16.849080 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 12 07:43:16 crc kubenswrapper[4599]: I1012 07:43:16.851809 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-7v5dn"] Oct 12 07:43:16 crc kubenswrapper[4599]: I1012 07:43:16.852787 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-7v5dn" Oct 12 07:43:16 crc kubenswrapper[4599]: I1012 07:43:16.854135 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-kpxnw"] Oct 12 07:43:16 crc kubenswrapper[4599]: I1012 07:43:16.854423 4599 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-v7qg7" Oct 12 07:43:16 crc kubenswrapper[4599]: I1012 07:43:16.859699 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-7v5dn"] Oct 12 07:43:16 crc kubenswrapper[4599]: I1012 07:43:16.862063 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-tp9pm"] Oct 12 07:43:16 crc kubenswrapper[4599]: I1012 07:43:16.862643 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-tp9pm" Oct 12 07:43:16 crc kubenswrapper[4599]: I1012 07:43:16.863814 4599 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-tsdsv" Oct 12 07:43:16 crc kubenswrapper[4599]: I1012 07:43:16.873669 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-tp9pm"] Oct 12 07:43:17 crc kubenswrapper[4599]: I1012 07:43:17.024554 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6md7\" (UniqueName: \"kubernetes.io/projected/70c0dff5-0cd7-4399-a044-c95469bea793-kube-api-access-j6md7\") pod \"cert-manager-5b446d88c5-7v5dn\" (UID: \"70c0dff5-0cd7-4399-a044-c95469bea793\") " pod="cert-manager/cert-manager-5b446d88c5-7v5dn" Oct 12 07:43:17 crc kubenswrapper[4599]: I1012 07:43:17.024921 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2whd\" (UniqueName: 
\"kubernetes.io/projected/88150699-5d1d-4b47-ad1e-bbe4cf006a3e-kube-api-access-c2whd\") pod \"cert-manager-cainjector-7f985d654d-kpxnw\" (UID: \"88150699-5d1d-4b47-ad1e-bbe4cf006a3e\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-kpxnw" Oct 12 07:43:17 crc kubenswrapper[4599]: I1012 07:43:17.025074 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvqd8\" (UniqueName: \"kubernetes.io/projected/c6d2a135-7c1d-4cfb-b8ee-fa9737f62776-kube-api-access-bvqd8\") pod \"cert-manager-webhook-5655c58dd6-tp9pm\" (UID: \"c6d2a135-7c1d-4cfb-b8ee-fa9737f62776\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-tp9pm" Oct 12 07:43:17 crc kubenswrapper[4599]: I1012 07:43:17.126525 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6md7\" (UniqueName: \"kubernetes.io/projected/70c0dff5-0cd7-4399-a044-c95469bea793-kube-api-access-j6md7\") pod \"cert-manager-5b446d88c5-7v5dn\" (UID: \"70c0dff5-0cd7-4399-a044-c95469bea793\") " pod="cert-manager/cert-manager-5b446d88c5-7v5dn" Oct 12 07:43:17 crc kubenswrapper[4599]: I1012 07:43:17.126567 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2whd\" (UniqueName: \"kubernetes.io/projected/88150699-5d1d-4b47-ad1e-bbe4cf006a3e-kube-api-access-c2whd\") pod \"cert-manager-cainjector-7f985d654d-kpxnw\" (UID: \"88150699-5d1d-4b47-ad1e-bbe4cf006a3e\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-kpxnw" Oct 12 07:43:17 crc kubenswrapper[4599]: I1012 07:43:17.126610 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvqd8\" (UniqueName: \"kubernetes.io/projected/c6d2a135-7c1d-4cfb-b8ee-fa9737f62776-kube-api-access-bvqd8\") pod \"cert-manager-webhook-5655c58dd6-tp9pm\" (UID: \"c6d2a135-7c1d-4cfb-b8ee-fa9737f62776\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-tp9pm" Oct 12 07:43:17 crc kubenswrapper[4599]: I1012 
07:43:17.143388 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6md7\" (UniqueName: \"kubernetes.io/projected/70c0dff5-0cd7-4399-a044-c95469bea793-kube-api-access-j6md7\") pod \"cert-manager-5b446d88c5-7v5dn\" (UID: \"70c0dff5-0cd7-4399-a044-c95469bea793\") " pod="cert-manager/cert-manager-5b446d88c5-7v5dn" Oct 12 07:43:17 crc kubenswrapper[4599]: I1012 07:43:17.143384 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvqd8\" (UniqueName: \"kubernetes.io/projected/c6d2a135-7c1d-4cfb-b8ee-fa9737f62776-kube-api-access-bvqd8\") pod \"cert-manager-webhook-5655c58dd6-tp9pm\" (UID: \"c6d2a135-7c1d-4cfb-b8ee-fa9737f62776\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-tp9pm" Oct 12 07:43:17 crc kubenswrapper[4599]: I1012 07:43:17.143731 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2whd\" (UniqueName: \"kubernetes.io/projected/88150699-5d1d-4b47-ad1e-bbe4cf006a3e-kube-api-access-c2whd\") pod \"cert-manager-cainjector-7f985d654d-kpxnw\" (UID: \"88150699-5d1d-4b47-ad1e-bbe4cf006a3e\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-kpxnw" Oct 12 07:43:17 crc kubenswrapper[4599]: I1012 07:43:17.163264 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-kpxnw" Oct 12 07:43:17 crc kubenswrapper[4599]: I1012 07:43:17.172303 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-7v5dn" Oct 12 07:43:17 crc kubenswrapper[4599]: I1012 07:43:17.178161 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-tp9pm" Oct 12 07:43:17 crc kubenswrapper[4599]: I1012 07:43:17.534602 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-7v5dn"] Oct 12 07:43:17 crc kubenswrapper[4599]: I1012 07:43:17.543851 4599 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 07:43:17 crc kubenswrapper[4599]: I1012 07:43:17.573092 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-kpxnw"] Oct 12 07:43:17 crc kubenswrapper[4599]: I1012 07:43:17.575314 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-tp9pm"] Oct 12 07:43:17 crc kubenswrapper[4599]: W1012 07:43:17.576870 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88150699_5d1d_4b47_ad1e_bbe4cf006a3e.slice/crio-3a703a11887321d65cf4b6cf1877f4cc60d6cb24efa83e8fd0a6b0fe7cd0a0bf WatchSource:0}: Error finding container 3a703a11887321d65cf4b6cf1877f4cc60d6cb24efa83e8fd0a6b0fe7cd0a0bf: Status 404 returned error can't find the container with id 3a703a11887321d65cf4b6cf1877f4cc60d6cb24efa83e8fd0a6b0fe7cd0a0bf Oct 12 07:43:17 crc kubenswrapper[4599]: W1012 07:43:17.582240 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6d2a135_7c1d_4cfb_b8ee_fa9737f62776.slice/crio-5813b5f87c326570f15f1facd49e3c25fd110cd83968a636daa1cc7c5be6ca96 WatchSource:0}: Error finding container 5813b5f87c326570f15f1facd49e3c25fd110cd83968a636daa1cc7c5be6ca96: Status 404 returned error can't find the container with id 5813b5f87c326570f15f1facd49e3c25fd110cd83968a636daa1cc7c5be6ca96 Oct 12 07:43:18 crc kubenswrapper[4599]: I1012 07:43:18.241059 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-webhook-5655c58dd6-tp9pm" event={"ID":"c6d2a135-7c1d-4cfb-b8ee-fa9737f62776","Type":"ContainerStarted","Data":"5813b5f87c326570f15f1facd49e3c25fd110cd83968a636daa1cc7c5be6ca96"} Oct 12 07:43:18 crc kubenswrapper[4599]: I1012 07:43:18.242019 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-7v5dn" event={"ID":"70c0dff5-0cd7-4399-a044-c95469bea793","Type":"ContainerStarted","Data":"54ae1b3edae8e049b2a17a7e51e792c0ff701fe824b32cd80a443ad6869cf4ec"} Oct 12 07:43:18 crc kubenswrapper[4599]: I1012 07:43:18.242970 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-kpxnw" event={"ID":"88150699-5d1d-4b47-ad1e-bbe4cf006a3e","Type":"ContainerStarted","Data":"3a703a11887321d65cf4b6cf1877f4cc60d6cb24efa83e8fd0a6b0fe7cd0a0bf"} Oct 12 07:43:21 crc kubenswrapper[4599]: I1012 07:43:21.259776 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-tp9pm" event={"ID":"c6d2a135-7c1d-4cfb-b8ee-fa9737f62776","Type":"ContainerStarted","Data":"ff13cd98576633fc67d572266980f422fcb58de17314cd631863bbc80de0528a"} Oct 12 07:43:21 crc kubenswrapper[4599]: I1012 07:43:21.260545 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-tp9pm" Oct 12 07:43:21 crc kubenswrapper[4599]: I1012 07:43:21.263836 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-7v5dn" event={"ID":"70c0dff5-0cd7-4399-a044-c95469bea793","Type":"ContainerStarted","Data":"bcb60675aa89d6fce1db05b3017b7487a1453ae595dc0a45dc11c242b4ba786b"} Oct 12 07:43:21 crc kubenswrapper[4599]: I1012 07:43:21.265498 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-kpxnw" 
event={"ID":"88150699-5d1d-4b47-ad1e-bbe4cf006a3e","Type":"ContainerStarted","Data":"de3b296ab51155a80fbbb912ba72bcd8c4270dbeb9bde7076f9f6443d66b970b"} Oct 12 07:43:21 crc kubenswrapper[4599]: I1012 07:43:21.278150 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-tp9pm" podStartSLOduration=2.522073701 podStartE2EDuration="5.278141089s" podCreationTimestamp="2025-10-12 07:43:16 +0000 UTC" firstStartedPulling="2025-10-12 07:43:17.584221895 +0000 UTC m=+494.373417397" lastFinishedPulling="2025-10-12 07:43:20.340289283 +0000 UTC m=+497.129484785" observedRunningTime="2025-10-12 07:43:21.275949188 +0000 UTC m=+498.065144700" watchObservedRunningTime="2025-10-12 07:43:21.278141089 +0000 UTC m=+498.067336592" Oct 12 07:43:21 crc kubenswrapper[4599]: I1012 07:43:21.289536 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-7v5dn" podStartSLOduration=2.499517244 podStartE2EDuration="5.289519889s" podCreationTimestamp="2025-10-12 07:43:16 +0000 UTC" firstStartedPulling="2025-10-12 07:43:17.54361862 +0000 UTC m=+494.332814122" lastFinishedPulling="2025-10-12 07:43:20.333621275 +0000 UTC m=+497.122816767" observedRunningTime="2025-10-12 07:43:21.287033431 +0000 UTC m=+498.076228932" watchObservedRunningTime="2025-10-12 07:43:21.289519889 +0000 UTC m=+498.078715392" Oct 12 07:43:21 crc kubenswrapper[4599]: I1012 07:43:21.309065 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-kpxnw" podStartSLOduration=2.552672972 podStartE2EDuration="5.309044145s" podCreationTimestamp="2025-10-12 07:43:16 +0000 UTC" firstStartedPulling="2025-10-12 07:43:17.579079216 +0000 UTC m=+494.368274718" lastFinishedPulling="2025-10-12 07:43:20.33545039 +0000 UTC m=+497.124645891" observedRunningTime="2025-10-12 07:43:21.30354994 +0000 UTC m=+498.092745462" watchObservedRunningTime="2025-10-12 
07:43:21.309044145 +0000 UTC m=+498.098239647" Oct 12 07:43:27 crc kubenswrapper[4599]: I1012 07:43:27.182311 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-tp9pm" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.321945 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.322379 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.358630 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-whk5b"] Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.358998 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovn-controller" containerID="cri-o://810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d" gracePeriod=30 Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.359078 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="sbdb" containerID="cri-o://ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5" gracePeriod=30 Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.359102 4599 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5" gracePeriod=30 Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.359139 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="kube-rbac-proxy-node" containerID="cri-o://543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb" gracePeriod=30 Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.359066 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="nbdb" containerID="cri-o://5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8" gracePeriod=30 Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.359173 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovn-acl-logging" containerID="cri-o://f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea" gracePeriod=30 Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.359678 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="northd" containerID="cri-o://a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c" gracePeriod=30 Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.401658 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovnkube-controller" 
containerID="cri-o://6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f" gracePeriod=30 Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.659068 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whk5b_1a95b7ab-8632-4332-a30f-64f28ef8d313/ovnkube-controller/3.log" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.662942 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whk5b_1a95b7ab-8632-4332-a30f-64f28ef8d313/ovn-acl-logging/0.log" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.663648 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whk5b_1a95b7ab-8632-4332-a30f-64f28ef8d313/ovn-controller/0.log" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.664216 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.710864 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rz8xt"] Oct 12 07:43:28 crc kubenswrapper[4599]: E1012 07:43:28.711174 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="nbdb" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711209 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="nbdb" Oct 12 07:43:28 crc kubenswrapper[4599]: E1012 07:43:28.711227 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovn-acl-logging" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711232 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovn-acl-logging" Oct 12 07:43:28 crc kubenswrapper[4599]: E1012 07:43:28.711239 4599 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="northd" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711245 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="northd" Oct 12 07:43:28 crc kubenswrapper[4599]: E1012 07:43:28.711256 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="sbdb" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711262 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="sbdb" Oct 12 07:43:28 crc kubenswrapper[4599]: E1012 07:43:28.711268 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovnkube-controller" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711287 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovnkube-controller" Oct 12 07:43:28 crc kubenswrapper[4599]: E1012 07:43:28.711294 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovnkube-controller" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711300 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovnkube-controller" Oct 12 07:43:28 crc kubenswrapper[4599]: E1012 07:43:28.711307 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovn-controller" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711313 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovn-controller" Oct 12 07:43:28 crc kubenswrapper[4599]: E1012 07:43:28.711318 4599 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovnkube-controller" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711323 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovnkube-controller" Oct 12 07:43:28 crc kubenswrapper[4599]: E1012 07:43:28.711350 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="kubecfg-setup" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711358 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="kubecfg-setup" Oct 12 07:43:28 crc kubenswrapper[4599]: E1012 07:43:28.711366 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="kube-rbac-proxy-ovn-metrics" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711372 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="kube-rbac-proxy-ovn-metrics" Oct 12 07:43:28 crc kubenswrapper[4599]: E1012 07:43:28.711381 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="kube-rbac-proxy-node" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711388 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="kube-rbac-proxy-node" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711513 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="nbdb" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711523 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovnkube-controller" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711531 4599 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="kube-rbac-proxy-ovn-metrics" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711539 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovn-acl-logging" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711545 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="kube-rbac-proxy-node" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711552 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovnkube-controller" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711559 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="northd" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711581 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovnkube-controller" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711587 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovn-controller" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711595 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="sbdb" Oct 12 07:43:28 crc kubenswrapper[4599]: E1012 07:43:28.711691 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovnkube-controller" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711698 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovnkube-controller" Oct 12 07:43:28 crc 
kubenswrapper[4599]: E1012 07:43:28.711705 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovnkube-controller" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711711 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovnkube-controller" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711835 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovnkube-controller" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.711847 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerName="ovnkube-controller" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.714284 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.795647 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-cni-netd\") pod \"1a95b7ab-8632-4332-a30f-64f28ef8d313\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.795787 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "1a95b7ab-8632-4332-a30f-64f28ef8d313" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796162 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzwzn\" (UniqueName: \"kubernetes.io/projected/1a95b7ab-8632-4332-a30f-64f28ef8d313-kube-api-access-qzwzn\") pod \"1a95b7ab-8632-4332-a30f-64f28ef8d313\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796486 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-systemd-units\") pod \"1a95b7ab-8632-4332-a30f-64f28ef8d313\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796534 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-run-systemd\") pod \"1a95b7ab-8632-4332-a30f-64f28ef8d313\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796565 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-slash\") pod \"1a95b7ab-8632-4332-a30f-64f28ef8d313\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796590 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-var-lib-openvswitch\") pod \"1a95b7ab-8632-4332-a30f-64f28ef8d313\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796626 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-var-lib-cni-networks-ovn-kubernetes\") pod \"1a95b7ab-8632-4332-a30f-64f28ef8d313\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796608 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "1a95b7ab-8632-4332-a30f-64f28ef8d313" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796665 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovnkube-script-lib\") pod \"1a95b7ab-8632-4332-a30f-64f28ef8d313\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796687 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-run-netns\") pod \"1a95b7ab-8632-4332-a30f-64f28ef8d313\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796705 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-slash" (OuterVolumeSpecName: "host-slash") pod "1a95b7ab-8632-4332-a30f-64f28ef8d313" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796717 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-run-openvswitch\") pod \"1a95b7ab-8632-4332-a30f-64f28ef8d313\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796748 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "1a95b7ab-8632-4332-a30f-64f28ef8d313" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796785 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-node-log\") pod \"1a95b7ab-8632-4332-a30f-64f28ef8d313\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796798 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "1a95b7ab-8632-4332-a30f-64f28ef8d313" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796762 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "1a95b7ab-8632-4332-a30f-64f28ef8d313" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796827 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "1a95b7ab-8632-4332-a30f-64f28ef8d313" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796873 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-cni-bin\") pod \"1a95b7ab-8632-4332-a30f-64f28ef8d313\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796895 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-node-log" (OuterVolumeSpecName: "node-log") pod "1a95b7ab-8632-4332-a30f-64f28ef8d313" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796921 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-run-ovn\") pod \"1a95b7ab-8632-4332-a30f-64f28ef8d313\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796943 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-log-socket\") pod \"1a95b7ab-8632-4332-a30f-64f28ef8d313\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796964 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "1a95b7ab-8632-4332-a30f-64f28ef8d313" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796981 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovnkube-config\") pod \"1a95b7ab-8632-4332-a30f-64f28ef8d313\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796980 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "1a95b7ab-8632-4332-a30f-64f28ef8d313" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.796996 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-log-socket" (OuterVolumeSpecName: "log-socket") pod "1a95b7ab-8632-4332-a30f-64f28ef8d313" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797006 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-env-overrides\") pod \"1a95b7ab-8632-4332-a30f-64f28ef8d313\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797063 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-kubelet\") pod \"1a95b7ab-8632-4332-a30f-64f28ef8d313\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797093 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovn-node-metrics-cert\") pod \"1a95b7ab-8632-4332-a30f-64f28ef8d313\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797118 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-etc-openvswitch\") pod \"1a95b7ab-8632-4332-a30f-64f28ef8d313\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797138 4599 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-run-ovn-kubernetes\") pod \"1a95b7ab-8632-4332-a30f-64f28ef8d313\" (UID: \"1a95b7ab-8632-4332-a30f-64f28ef8d313\") " Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797145 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "1a95b7ab-8632-4332-a30f-64f28ef8d313" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797300 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-run-ovn\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797356 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-cni-netd\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797375 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "1a95b7ab-8632-4332-a30f-64f28ef8d313" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797392 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-run-systemd\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797420 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-kubelet\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797417 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "1a95b7ab-8632-4332-a30f-64f28ef8d313" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797441 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "1a95b7ab-8632-4332-a30f-64f28ef8d313" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797455 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "1a95b7ab-8632-4332-a30f-64f28ef8d313" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797470 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "1a95b7ab-8632-4332-a30f-64f28ef8d313" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797654 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-var-lib-openvswitch\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797706 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-run-netns\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797793 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-etc-openvswitch\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797832 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-cni-bin\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797869 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-node-log\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797921 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60a2166e-d71e-4540-84ce-92bf345b9db2-ovnkube-config\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797962 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-slash\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.797980 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/60a2166e-d71e-4540-84ce-92bf345b9db2-ovnkube-script-lib\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798001 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60a2166e-d71e-4540-84ce-92bf345b9db2-env-overrides\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798022 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60a2166e-d71e-4540-84ce-92bf345b9db2-ovn-node-metrics-cert\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798040 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-run-openvswitch\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798118 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-log-socket\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798138 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-pkpb6\" (UniqueName: \"kubernetes.io/projected/60a2166e-d71e-4540-84ce-92bf345b9db2-kube-api-access-pkpb6\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798189 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798224 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-systemd-units\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798269 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-run-ovn-kubernetes\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798354 4599 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798373 4599 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-log-socket\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798384 4599 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798393 4599 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798404 4599 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798415 4599 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798426 4599 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798438 4599 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798449 4599 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 
12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798458 4599 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-slash\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798469 4599 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798478 4599 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798488 4599 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798497 4599 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798505 4599 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798515 4599 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-node-log\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.798525 4599 
reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.803443 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "1a95b7ab-8632-4332-a30f-64f28ef8d313" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.803806 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a95b7ab-8632-4332-a30f-64f28ef8d313-kube-api-access-qzwzn" (OuterVolumeSpecName: "kube-api-access-qzwzn") pod "1a95b7ab-8632-4332-a30f-64f28ef8d313" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313"). InnerVolumeSpecName "kube-api-access-qzwzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.812447 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "1a95b7ab-8632-4332-a30f-64f28ef8d313" (UID: "1a95b7ab-8632-4332-a30f-64f28ef8d313"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899060 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-var-lib-openvswitch\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899110 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-run-netns\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899128 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-etc-openvswitch\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899149 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-cni-bin\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899169 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-node-log\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc 
kubenswrapper[4599]: I1012 07:43:28.899172 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-run-netns\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899195 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60a2166e-d71e-4540-84ce-92bf345b9db2-ovnkube-config\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899175 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-var-lib-openvswitch\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899198 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-etc-openvswitch\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899205 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-cni-bin\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899228 4599 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-node-log\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899355 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-slash\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899381 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/60a2166e-d71e-4540-84ce-92bf345b9db2-ovnkube-script-lib\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899404 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60a2166e-d71e-4540-84ce-92bf345b9db2-env-overrides\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899424 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60a2166e-d71e-4540-84ce-92bf345b9db2-ovn-node-metrics-cert\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899444 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-run-openvswitch\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899468 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-log-socket\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899492 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkpb6\" (UniqueName: \"kubernetes.io/projected/60a2166e-d71e-4540-84ce-92bf345b9db2-kube-api-access-pkpb6\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899515 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899536 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-systemd-units\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899567 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-run-ovn-kubernetes\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899596 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-run-ovn\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899614 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-cni-netd\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899609 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-run-openvswitch\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899658 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899678 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-slash\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899708 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-run-ovn\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899713 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-log-socket\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899709 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-cni-netd\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899675 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-run-ovn-kubernetes\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899710 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-systemd-units\") pod \"ovnkube-node-rz8xt\" (UID: 
\"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899811 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-run-systemd\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.899641 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-run-systemd\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.900098 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-kubelet\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.900108 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60a2166e-d71e-4540-84ce-92bf345b9db2-env-overrides\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.900197 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/60a2166e-d71e-4540-84ce-92bf345b9db2-host-kubelet\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc 
kubenswrapper[4599]: I1012 07:43:28.900227 4599 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a95b7ab-8632-4332-a30f-64f28ef8d313-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.900243 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzwzn\" (UniqueName: \"kubernetes.io/projected/1a95b7ab-8632-4332-a30f-64f28ef8d313-kube-api-access-qzwzn\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.900255 4599 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a95b7ab-8632-4332-a30f-64f28ef8d313-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.900316 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/60a2166e-d71e-4540-84ce-92bf345b9db2-ovnkube-script-lib\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.900505 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60a2166e-d71e-4540-84ce-92bf345b9db2-ovnkube-config\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 07:43:28.904245 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60a2166e-d71e-4540-84ce-92bf345b9db2-ovn-node-metrics-cert\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:28 crc kubenswrapper[4599]: I1012 
07:43:28.913873 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkpb6\" (UniqueName: \"kubernetes.io/projected/60a2166e-d71e-4540-84ce-92bf345b9db2-kube-api-access-pkpb6\") pod \"ovnkube-node-rz8xt\" (UID: \"60a2166e-d71e-4540-84ce-92bf345b9db2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.026042 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.305280 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hm26_ce311f52-0501-45d3-8209-b1d2aa25028b/kube-multus/2.log" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.306158 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hm26_ce311f52-0501-45d3-8209-b1d2aa25028b/kube-multus/1.log" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.306280 4599 generic.go:334] "Generic (PLEG): container finished" podID="ce311f52-0501-45d3-8209-b1d2aa25028b" containerID="cba10cb566eaf12febccccdd40f3241b968188857ba61ada0445c5f8b6b5b363" exitCode=2 Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.306402 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8hm26" event={"ID":"ce311f52-0501-45d3-8209-b1d2aa25028b","Type":"ContainerDied","Data":"cba10cb566eaf12febccccdd40f3241b968188857ba61ada0445c5f8b6b5b363"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.306498 4599 scope.go:117] "RemoveContainer" containerID="1d07bb1c67c880fb49b68be21180f0a5a053da62097b38605440625d10650033" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.307066 4599 scope.go:117] "RemoveContainer" containerID="cba10cb566eaf12febccccdd40f3241b968188857ba61ada0445c5f8b6b5b363" Oct 12 07:43:29 crc kubenswrapper[4599]: E1012 07:43:29.307433 4599 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8hm26_openshift-multus(ce311f52-0501-45d3-8209-b1d2aa25028b)\"" pod="openshift-multus/multus-8hm26" podUID="ce311f52-0501-45d3-8209-b1d2aa25028b" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.309848 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whk5b_1a95b7ab-8632-4332-a30f-64f28ef8d313/ovnkube-controller/3.log" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.312625 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whk5b_1a95b7ab-8632-4332-a30f-64f28ef8d313/ovn-acl-logging/0.log" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.313764 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whk5b_1a95b7ab-8632-4332-a30f-64f28ef8d313/ovn-controller/0.log" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314278 4599 generic.go:334] "Generic (PLEG): container finished" podID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerID="6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f" exitCode=0 Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314310 4599 generic.go:334] "Generic (PLEG): container finished" podID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerID="ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5" exitCode=0 Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314319 4599 generic.go:334] "Generic (PLEG): container finished" podID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerID="5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8" exitCode=0 Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314326 4599 generic.go:334] "Generic (PLEG): container finished" podID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerID="a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c" 
exitCode=0 Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314354 4599 generic.go:334] "Generic (PLEG): container finished" podID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerID="c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5" exitCode=0 Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314361 4599 generic.go:334] "Generic (PLEG): container finished" podID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerID="543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb" exitCode=0 Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314368 4599 generic.go:334] "Generic (PLEG): container finished" podID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerID="f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea" exitCode=143 Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314375 4599 generic.go:334] "Generic (PLEG): container finished" podID="1a95b7ab-8632-4332-a30f-64f28ef8d313" containerID="810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d" exitCode=143 Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314432 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314454 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerDied","Data":"6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314500 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerDied","Data":"ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314514 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerDied","Data":"5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314531 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerDied","Data":"a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314541 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerDied","Data":"c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314551 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerDied","Data":"543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb"} Oct 12 07:43:29 crc 
kubenswrapper[4599]: I1012 07:43:29.314563 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314578 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314583 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314589 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314596 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314600 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314605 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314610 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea"} Oct 12 07:43:29 crc 
kubenswrapper[4599]: I1012 07:43:29.314614 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314619 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314626 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerDied","Data":"f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314633 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314643 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314649 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314654 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314660 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314665 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314670 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314675 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314681 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314686 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314695 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerDied","Data":"810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314703 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314709 4599 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314714 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314721 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314725 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314731 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314736 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314741 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314746 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314750 4599 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314756 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whk5b" event={"ID":"1a95b7ab-8632-4332-a30f-64f28ef8d313","Type":"ContainerDied","Data":"ae3914feb32fa6adbbf41b62fde5ead6f30a591f461ac1ee313218ae5cefb8c3"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314764 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314770 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314788 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314794 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314798 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314804 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5"} Oct 12 
07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314809 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314819 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314824 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.314831 4599 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.316677 4599 generic.go:334] "Generic (PLEG): container finished" podID="60a2166e-d71e-4540-84ce-92bf345b9db2" containerID="ae7844017f7f324b095e4e491ec8187d80d21ee208fcaf58a4e3e394191b2a7a" exitCode=0 Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.316706 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" event={"ID":"60a2166e-d71e-4540-84ce-92bf345b9db2","Type":"ContainerDied","Data":"ae7844017f7f324b095e4e491ec8187d80d21ee208fcaf58a4e3e394191b2a7a"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.316726 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" event={"ID":"60a2166e-d71e-4540-84ce-92bf345b9db2","Type":"ContainerStarted","Data":"0cc3ac14d98e872c6c1d2bc9c4af2cefaea2591b0f4b19af310ebe66b695f1b5"} Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.328655 4599 scope.go:117] "RemoveContainer" 
containerID="6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.347226 4599 scope.go:117] "RemoveContainer" containerID="22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.374446 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-whk5b"] Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.379721 4599 scope.go:117] "RemoveContainer" containerID="ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.380979 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-whk5b"] Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.392655 4599 scope.go:117] "RemoveContainer" containerID="5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.405524 4599 scope.go:117] "RemoveContainer" containerID="a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.418484 4599 scope.go:117] "RemoveContainer" containerID="c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.428667 4599 scope.go:117] "RemoveContainer" containerID="543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.448429 4599 scope.go:117] "RemoveContainer" containerID="f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.458448 4599 scope.go:117] "RemoveContainer" containerID="810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.480602 4599 scope.go:117] "RemoveContainer" 
containerID="ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.503321 4599 scope.go:117] "RemoveContainer" containerID="6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f" Oct 12 07:43:29 crc kubenswrapper[4599]: E1012 07:43:29.503668 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f\": container with ID starting with 6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f not found: ID does not exist" containerID="6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.503706 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f"} err="failed to get container status \"6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f\": rpc error: code = NotFound desc = could not find container \"6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f\": container with ID starting with 6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.503731 4599 scope.go:117] "RemoveContainer" containerID="22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1" Oct 12 07:43:29 crc kubenswrapper[4599]: E1012 07:43:29.504033 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1\": container with ID starting with 22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1 not found: ID does not exist" containerID="22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1" Oct 12 07:43:29 crc 
kubenswrapper[4599]: I1012 07:43:29.504062 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1"} err="failed to get container status \"22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1\": rpc error: code = NotFound desc = could not find container \"22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1\": container with ID starting with 22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1 not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.504082 4599 scope.go:117] "RemoveContainer" containerID="ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5" Oct 12 07:43:29 crc kubenswrapper[4599]: E1012 07:43:29.504380 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\": container with ID starting with ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5 not found: ID does not exist" containerID="ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.504406 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5"} err="failed to get container status \"ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\": rpc error: code = NotFound desc = could not find container \"ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\": container with ID starting with ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5 not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.504424 4599 scope.go:117] "RemoveContainer" containerID="5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8" Oct 12 
07:43:29 crc kubenswrapper[4599]: E1012 07:43:29.504635 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\": container with ID starting with 5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8 not found: ID does not exist" containerID="5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.504655 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8"} err="failed to get container status \"5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\": rpc error: code = NotFound desc = could not find container \"5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\": container with ID starting with 5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8 not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.504668 4599 scope.go:117] "RemoveContainer" containerID="a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c" Oct 12 07:43:29 crc kubenswrapper[4599]: E1012 07:43:29.504989 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\": container with ID starting with a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c not found: ID does not exist" containerID="a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.505026 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c"} err="failed to get container status 
\"a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\": rpc error: code = NotFound desc = could not find container \"a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\": container with ID starting with a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.505040 4599 scope.go:117] "RemoveContainer" containerID="c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5" Oct 12 07:43:29 crc kubenswrapper[4599]: E1012 07:43:29.505229 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\": container with ID starting with c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5 not found: ID does not exist" containerID="c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.505251 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5"} err="failed to get container status \"c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\": rpc error: code = NotFound desc = could not find container \"c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\": container with ID starting with c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5 not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.505262 4599 scope.go:117] "RemoveContainer" containerID="543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb" Oct 12 07:43:29 crc kubenswrapper[4599]: E1012 07:43:29.505654 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\": container with ID starting with 543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb not found: ID does not exist" containerID="543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.505679 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb"} err="failed to get container status \"543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\": rpc error: code = NotFound desc = could not find container \"543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\": container with ID starting with 543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.505693 4599 scope.go:117] "RemoveContainer" containerID="f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea" Oct 12 07:43:29 crc kubenswrapper[4599]: E1012 07:43:29.505928 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\": container with ID starting with f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea not found: ID does not exist" containerID="f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.505949 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea"} err="failed to get container status \"f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\": rpc error: code = NotFound desc = could not find container \"f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\": container with ID 
starting with f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.505965 4599 scope.go:117] "RemoveContainer" containerID="810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d" Oct 12 07:43:29 crc kubenswrapper[4599]: E1012 07:43:29.506226 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\": container with ID starting with 810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d not found: ID does not exist" containerID="810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.506248 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d"} err="failed to get container status \"810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\": rpc error: code = NotFound desc = could not find container \"810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\": container with ID starting with 810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.506261 4599 scope.go:117] "RemoveContainer" containerID="ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5" Oct 12 07:43:29 crc kubenswrapper[4599]: E1012 07:43:29.506501 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\": container with ID starting with ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5 not found: ID does not exist" containerID="ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5" Oct 12 
07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.506528 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5"} err="failed to get container status \"ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\": rpc error: code = NotFound desc = could not find container \"ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\": container with ID starting with ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5 not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.506621 4599 scope.go:117] "RemoveContainer" containerID="6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.506977 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f"} err="failed to get container status \"6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f\": rpc error: code = NotFound desc = could not find container \"6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f\": container with ID starting with 6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.507004 4599 scope.go:117] "RemoveContainer" containerID="22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.507836 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1"} err="failed to get container status \"22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1\": rpc error: code = NotFound desc = could not find container 
\"22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1\": container with ID starting with 22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1 not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.507860 4599 scope.go:117] "RemoveContainer" containerID="ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.508120 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5"} err="failed to get container status \"ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\": rpc error: code = NotFound desc = could not find container \"ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\": container with ID starting with ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5 not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.508166 4599 scope.go:117] "RemoveContainer" containerID="5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.508594 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8"} err="failed to get container status \"5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\": rpc error: code = NotFound desc = could not find container \"5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\": container with ID starting with 5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8 not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.508634 4599 scope.go:117] "RemoveContainer" containerID="a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.508910 4599 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c"} err="failed to get container status \"a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\": rpc error: code = NotFound desc = could not find container \"a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\": container with ID starting with a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.508941 4599 scope.go:117] "RemoveContainer" containerID="c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.509380 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5"} err="failed to get container status \"c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\": rpc error: code = NotFound desc = could not find container \"c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\": container with ID starting with c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5 not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.509403 4599 scope.go:117] "RemoveContainer" containerID="543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.509798 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb"} err="failed to get container status \"543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\": rpc error: code = NotFound desc = could not find container \"543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\": container with ID starting with 
543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.509841 4599 scope.go:117] "RemoveContainer" containerID="f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.510160 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea"} err="failed to get container status \"f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\": rpc error: code = NotFound desc = could not find container \"f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\": container with ID starting with f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.510184 4599 scope.go:117] "RemoveContainer" containerID="810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.510425 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d"} err="failed to get container status \"810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\": rpc error: code = NotFound desc = could not find container \"810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\": container with ID starting with 810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.510452 4599 scope.go:117] "RemoveContainer" containerID="ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.510733 4599 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5"} err="failed to get container status \"ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\": rpc error: code = NotFound desc = could not find container \"ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\": container with ID starting with ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5 not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.510755 4599 scope.go:117] "RemoveContainer" containerID="6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.510994 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f"} err="failed to get container status \"6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f\": rpc error: code = NotFound desc = could not find container \"6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f\": container with ID starting with 6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.511012 4599 scope.go:117] "RemoveContainer" containerID="22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.511227 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1"} err="failed to get container status \"22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1\": rpc error: code = NotFound desc = could not find container \"22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1\": container with ID starting with 22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1 not found: ID does not 
exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.511249 4599 scope.go:117] "RemoveContainer" containerID="ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.511529 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5"} err="failed to get container status \"ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\": rpc error: code = NotFound desc = could not find container \"ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\": container with ID starting with ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5 not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.511551 4599 scope.go:117] "RemoveContainer" containerID="5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.511842 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8"} err="failed to get container status \"5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\": rpc error: code = NotFound desc = could not find container \"5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\": container with ID starting with 5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8 not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.511861 4599 scope.go:117] "RemoveContainer" containerID="a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.512135 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c"} err="failed to get container status 
\"a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\": rpc error: code = NotFound desc = could not find container \"a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\": container with ID starting with a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.512158 4599 scope.go:117] "RemoveContainer" containerID="c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.512410 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5"} err="failed to get container status \"c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\": rpc error: code = NotFound desc = could not find container \"c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\": container with ID starting with c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5 not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.512428 4599 scope.go:117] "RemoveContainer" containerID="543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.512688 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb"} err="failed to get container status \"543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\": rpc error: code = NotFound desc = could not find container \"543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\": container with ID starting with 543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.512707 4599 scope.go:117] "RemoveContainer" 
containerID="f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.512915 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea"} err="failed to get container status \"f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\": rpc error: code = NotFound desc = could not find container \"f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\": container with ID starting with f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.512933 4599 scope.go:117] "RemoveContainer" containerID="810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.513123 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d"} err="failed to get container status \"810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\": rpc error: code = NotFound desc = could not find container \"810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\": container with ID starting with 810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.513151 4599 scope.go:117] "RemoveContainer" containerID="ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.513441 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5"} err="failed to get container status \"ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\": rpc error: code = NotFound desc = could 
not find container \"ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\": container with ID starting with ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5 not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.513460 4599 scope.go:117] "RemoveContainer" containerID="6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.513750 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f"} err="failed to get container status \"6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f\": rpc error: code = NotFound desc = could not find container \"6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f\": container with ID starting with 6baea2afbd1f3920c0713234756041f600567cc0d372d1e6e4007bfe2b9e663f not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.513774 4599 scope.go:117] "RemoveContainer" containerID="22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.514005 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1"} err="failed to get container status \"22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1\": rpc error: code = NotFound desc = could not find container \"22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1\": container with ID starting with 22386ac9ba5ff4e4575b13c71cd5865e1a6ea9c403197c2c0ea796f37f1085d1 not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.514025 4599 scope.go:117] "RemoveContainer" containerID="ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 
07:43:29.514229 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5"} err="failed to get container status \"ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\": rpc error: code = NotFound desc = could not find container \"ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5\": container with ID starting with ce496909e393b4896eb9ae165317648df3e828d219c231d95c6057cd08b78fc5 not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.514254 4599 scope.go:117] "RemoveContainer" containerID="5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.514594 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8"} err="failed to get container status \"5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\": rpc error: code = NotFound desc = could not find container \"5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8\": container with ID starting with 5f73bbbb549129c48a1421dda3d2385cf3515feafd8215cdac6b792823d614b8 not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.514613 4599 scope.go:117] "RemoveContainer" containerID="a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.514875 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c"} err="failed to get container status \"a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\": rpc error: code = NotFound desc = could not find container \"a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c\": container with ID starting with 
a156f480c7252f61ec3dc991814db0e7259fdf7ae618c8f43d86da97f8a6bc5c not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.514894 4599 scope.go:117] "RemoveContainer" containerID="c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.515090 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5"} err="failed to get container status \"c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\": rpc error: code = NotFound desc = could not find container \"c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5\": container with ID starting with c150b687d6b0c337604ac14537335d57d9c955a9e31001d274507e5e65ac7bc5 not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.515108 4599 scope.go:117] "RemoveContainer" containerID="543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.515306 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb"} err="failed to get container status \"543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\": rpc error: code = NotFound desc = could not find container \"543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb\": container with ID starting with 543f674ce98b17c90c5a351ae72f596a2e90006a1b86fcede3359715c67b94bb not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.515323 4599 scope.go:117] "RemoveContainer" containerID="f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.515577 4599 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea"} err="failed to get container status \"f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\": rpc error: code = NotFound desc = could not find container \"f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea\": container with ID starting with f99854365302221a36d3d7647292430b5655222c551953b6aa6de9eb1c922bea not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.515595 4599 scope.go:117] "RemoveContainer" containerID="810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.515828 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d"} err="failed to get container status \"810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\": rpc error: code = NotFound desc = could not find container \"810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d\": container with ID starting with 810de96e34e6b028e90bcc5693f1624d337cdd775c01901fc91e6dd2a7f3564d not found: ID does not exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.515850 4599 scope.go:117] "RemoveContainer" containerID="ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.516060 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5"} err="failed to get container status \"ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\": rpc error: code = NotFound desc = could not find container \"ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5\": container with ID starting with ce4f64c7fab07385f9215821c5c75c5b345653fa141a238bb3dd74e7a1a291a5 not found: ID does not 
exist" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.550526 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a95b7ab-8632-4332-a30f-64f28ef8d313" path="/var/lib/kubelet/pods/1a95b7ab-8632-4332-a30f-64f28ef8d313/volumes" Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.759884 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" podUID="6d9a3bfc-818e-445a-a027-264cfcfade2b" containerName="registry" containerID="cri-o://bd7cef82d922801b435ec3c8a023cdf88cb3edbfb61625290db1f691e9c21629" gracePeriod=30 Oct 12 07:43:29 crc kubenswrapper[4599]: I1012 07:43:29.927722 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.116364 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92bvg\" (UniqueName: \"kubernetes.io/projected/6d9a3bfc-818e-445a-a027-264cfcfade2b-kube-api-access-92bvg\") pod \"6d9a3bfc-818e-445a-a027-264cfcfade2b\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.116427 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d9a3bfc-818e-445a-a027-264cfcfade2b-bound-sa-token\") pod \"6d9a3bfc-818e-445a-a027-264cfcfade2b\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.116486 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6d9a3bfc-818e-445a-a027-264cfcfade2b-installation-pull-secrets\") pod \"6d9a3bfc-818e-445a-a027-264cfcfade2b\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.116548 
4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6d9a3bfc-818e-445a-a027-264cfcfade2b-ca-trust-extracted\") pod \"6d9a3bfc-818e-445a-a027-264cfcfade2b\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.116684 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"6d9a3bfc-818e-445a-a027-264cfcfade2b\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.116752 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d9a3bfc-818e-445a-a027-264cfcfade2b-registry-tls\") pod \"6d9a3bfc-818e-445a-a027-264cfcfade2b\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.116817 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d9a3bfc-818e-445a-a027-264cfcfade2b-trusted-ca\") pod \"6d9a3bfc-818e-445a-a027-264cfcfade2b\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.116906 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6d9a3bfc-818e-445a-a027-264cfcfade2b-registry-certificates\") pod \"6d9a3bfc-818e-445a-a027-264cfcfade2b\" (UID: \"6d9a3bfc-818e-445a-a027-264cfcfade2b\") " Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.118108 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d9a3bfc-818e-445a-a027-264cfcfade2b-registry-certificates" (OuterVolumeSpecName: 
"registry-certificates") pod "6d9a3bfc-818e-445a-a027-264cfcfade2b" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.118559 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d9a3bfc-818e-445a-a027-264cfcfade2b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6d9a3bfc-818e-445a-a027-264cfcfade2b" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.124468 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d9a3bfc-818e-445a-a027-264cfcfade2b-kube-api-access-92bvg" (OuterVolumeSpecName: "kube-api-access-92bvg") pod "6d9a3bfc-818e-445a-a027-264cfcfade2b" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b"). InnerVolumeSpecName "kube-api-access-92bvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.124618 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d9a3bfc-818e-445a-a027-264cfcfade2b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "6d9a3bfc-818e-445a-a027-264cfcfade2b" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.125025 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d9a3bfc-818e-445a-a027-264cfcfade2b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "6d9a3bfc-818e-445a-a027-264cfcfade2b" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.125521 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d9a3bfc-818e-445a-a027-264cfcfade2b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "6d9a3bfc-818e-445a-a027-264cfcfade2b" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.126567 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "6d9a3bfc-818e-445a-a027-264cfcfade2b" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.132804 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d9a3bfc-818e-445a-a027-264cfcfade2b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "6d9a3bfc-818e-445a-a027-264cfcfade2b" (UID: "6d9a3bfc-818e-445a-a027-264cfcfade2b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.218423 4599 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6d9a3bfc-818e-445a-a027-264cfcfade2b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.218454 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92bvg\" (UniqueName: \"kubernetes.io/projected/6d9a3bfc-818e-445a-a027-264cfcfade2b-kube-api-access-92bvg\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.218470 4599 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d9a3bfc-818e-445a-a027-264cfcfade2b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.218482 4599 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6d9a3bfc-818e-445a-a027-264cfcfade2b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.218494 4599 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6d9a3bfc-818e-445a-a027-264cfcfade2b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.218505 4599 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d9a3bfc-818e-445a-a027-264cfcfade2b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.218516 4599 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d9a3bfc-818e-445a-a027-264cfcfade2b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:43:30 crc 
kubenswrapper[4599]: I1012 07:43:30.344902 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" event={"ID":"60a2166e-d71e-4540-84ce-92bf345b9db2","Type":"ContainerStarted","Data":"df763b087d2a9f9126b269c36a32efb9fb8b9ac055090d5d9b935741c116b5ab"} Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.345768 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" event={"ID":"60a2166e-d71e-4540-84ce-92bf345b9db2","Type":"ContainerStarted","Data":"da80b9bf032e81face1d3ee578f84a5ae7bd13b6bf8010b88261e055a08c98b4"} Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.345848 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" event={"ID":"60a2166e-d71e-4540-84ce-92bf345b9db2","Type":"ContainerStarted","Data":"d4b0cafb76d5e859f419858d8f84673cc0f812bb254c55f4ff7de723704e4bbb"} Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.345916 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" event={"ID":"60a2166e-d71e-4540-84ce-92bf345b9db2","Type":"ContainerStarted","Data":"3103d92f1ae7330239543b04cb54b7b03a91775a98046958121e1b9b5862907c"} Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.345976 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" event={"ID":"60a2166e-d71e-4540-84ce-92bf345b9db2","Type":"ContainerStarted","Data":"1a489f4875667c578bee875b1c670f5fc235b1b5ede3cbcd5edc44c6a890692d"} Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.346028 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" event={"ID":"60a2166e-d71e-4540-84ce-92bf345b9db2","Type":"ContainerStarted","Data":"870627887b59f99d655cbe714a28c6d361acd682d3e3aa9f0ce25b57178f7679"} Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.347483 4599 generic.go:334] "Generic 
(PLEG): container finished" podID="6d9a3bfc-818e-445a-a027-264cfcfade2b" containerID="bd7cef82d922801b435ec3c8a023cdf88cb3edbfb61625290db1f691e9c21629" exitCode=0 Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.347541 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" event={"ID":"6d9a3bfc-818e-445a-a027-264cfcfade2b","Type":"ContainerDied","Data":"bd7cef82d922801b435ec3c8a023cdf88cb3edbfb61625290db1f691e9c21629"} Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.347594 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.347641 4599 scope.go:117] "RemoveContainer" containerID="bd7cef82d922801b435ec3c8a023cdf88cb3edbfb61625290db1f691e9c21629" Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.347616 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fwlgl" event={"ID":"6d9a3bfc-818e-445a-a027-264cfcfade2b","Type":"ContainerDied","Data":"2774b3586dd22f700ce384d2404bb08a4474566fad82f025a372971a167f992a"} Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.350025 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hm26_ce311f52-0501-45d3-8209-b1d2aa25028b/kube-multus/2.log" Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.363018 4599 scope.go:117] "RemoveContainer" containerID="bd7cef82d922801b435ec3c8a023cdf88cb3edbfb61625290db1f691e9c21629" Oct 12 07:43:30 crc kubenswrapper[4599]: E1012 07:43:30.363400 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd7cef82d922801b435ec3c8a023cdf88cb3edbfb61625290db1f691e9c21629\": container with ID starting with bd7cef82d922801b435ec3c8a023cdf88cb3edbfb61625290db1f691e9c21629 not found: ID does not exist" 
containerID="bd7cef82d922801b435ec3c8a023cdf88cb3edbfb61625290db1f691e9c21629" Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.363438 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd7cef82d922801b435ec3c8a023cdf88cb3edbfb61625290db1f691e9c21629"} err="failed to get container status \"bd7cef82d922801b435ec3c8a023cdf88cb3edbfb61625290db1f691e9c21629\": rpc error: code = NotFound desc = could not find container \"bd7cef82d922801b435ec3c8a023cdf88cb3edbfb61625290db1f691e9c21629\": container with ID starting with bd7cef82d922801b435ec3c8a023cdf88cb3edbfb61625290db1f691e9c21629 not found: ID does not exist" Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.371219 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fwlgl"] Oct 12 07:43:30 crc kubenswrapper[4599]: I1012 07:43:30.373933 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fwlgl"] Oct 12 07:43:31 crc kubenswrapper[4599]: I1012 07:43:31.551816 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d9a3bfc-818e-445a-a027-264cfcfade2b" path="/var/lib/kubelet/pods/6d9a3bfc-818e-445a-a027-264cfcfade2b/volumes" Oct 12 07:43:32 crc kubenswrapper[4599]: I1012 07:43:32.367535 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" event={"ID":"60a2166e-d71e-4540-84ce-92bf345b9db2","Type":"ContainerStarted","Data":"ea167a3088312f54c13f510477da6cb4495b6411aa8fa50d7cb2e7a9703ed743"} Oct 12 07:43:34 crc kubenswrapper[4599]: I1012 07:43:34.386048 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" event={"ID":"60a2166e-d71e-4540-84ce-92bf345b9db2","Type":"ContainerStarted","Data":"041bb8e710d448201a549040dfa026a779df6ef0a815ab0e245b452e72606652"} Oct 12 07:43:34 crc kubenswrapper[4599]: I1012 07:43:34.386486 4599 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:34 crc kubenswrapper[4599]: I1012 07:43:34.386515 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:34 crc kubenswrapper[4599]: I1012 07:43:34.386564 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:34 crc kubenswrapper[4599]: I1012 07:43:34.407095 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:34 crc kubenswrapper[4599]: I1012 07:43:34.408284 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:34 crc kubenswrapper[4599]: I1012 07:43:34.413383 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" podStartSLOduration=6.413328547 podStartE2EDuration="6.413328547s" podCreationTimestamp="2025-10-12 07:43:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:43:34.410615319 +0000 UTC m=+511.199810831" watchObservedRunningTime="2025-10-12 07:43:34.413328547 +0000 UTC m=+511.202524049" Oct 12 07:43:42 crc kubenswrapper[4599]: I1012 07:43:42.545125 4599 scope.go:117] "RemoveContainer" containerID="cba10cb566eaf12febccccdd40f3241b968188857ba61ada0445c5f8b6b5b363" Oct 12 07:43:42 crc kubenswrapper[4599]: E1012 07:43:42.546083 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8hm26_openshift-multus(ce311f52-0501-45d3-8209-b1d2aa25028b)\"" pod="openshift-multus/multus-8hm26" 
podUID="ce311f52-0501-45d3-8209-b1d2aa25028b" Oct 12 07:43:53 crc kubenswrapper[4599]: I1012 07:43:53.547978 4599 scope.go:117] "RemoveContainer" containerID="cba10cb566eaf12febccccdd40f3241b968188857ba61ada0445c5f8b6b5b363" Oct 12 07:43:54 crc kubenswrapper[4599]: I1012 07:43:54.496691 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hm26_ce311f52-0501-45d3-8209-b1d2aa25028b/kube-multus/2.log" Oct 12 07:43:54 crc kubenswrapper[4599]: I1012 07:43:54.497050 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8hm26" event={"ID":"ce311f52-0501-45d3-8209-b1d2aa25028b","Type":"ContainerStarted","Data":"34e1272c40b55568fcc2d13b512fbc5de9c4ba18b9a4cc02be4aef0f8181eb77"} Oct 12 07:43:54 crc kubenswrapper[4599]: I1012 07:43:54.801858 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr"] Oct 12 07:43:54 crc kubenswrapper[4599]: E1012 07:43:54.802677 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9a3bfc-818e-445a-a027-264cfcfade2b" containerName="registry" Oct 12 07:43:54 crc kubenswrapper[4599]: I1012 07:43:54.802703 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9a3bfc-818e-445a-a027-264cfcfade2b" containerName="registry" Oct 12 07:43:54 crc kubenswrapper[4599]: I1012 07:43:54.802797 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d9a3bfc-818e-445a-a027-264cfcfade2b" containerName="registry" Oct 12 07:43:54 crc kubenswrapper[4599]: I1012 07:43:54.803534 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr" Oct 12 07:43:54 crc kubenswrapper[4599]: I1012 07:43:54.808623 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 12 07:43:54 crc kubenswrapper[4599]: I1012 07:43:54.810164 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr"] Oct 12 07:43:54 crc kubenswrapper[4599]: I1012 07:43:54.829872 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bwlh\" (UniqueName: \"kubernetes.io/projected/82699d9d-fb0c-46be-9020-16bb7e4cb65c-kube-api-access-4bwlh\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr\" (UID: \"82699d9d-fb0c-46be-9020-16bb7e4cb65c\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr" Oct 12 07:43:54 crc kubenswrapper[4599]: I1012 07:43:54.829942 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82699d9d-fb0c-46be-9020-16bb7e4cb65c-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr\" (UID: \"82699d9d-fb0c-46be-9020-16bb7e4cb65c\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr" Oct 12 07:43:54 crc kubenswrapper[4599]: I1012 07:43:54.829981 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82699d9d-fb0c-46be-9020-16bb7e4cb65c-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr\" (UID: \"82699d9d-fb0c-46be-9020-16bb7e4cb65c\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr" Oct 12 07:43:54 crc kubenswrapper[4599]: 
I1012 07:43:54.931733 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82699d9d-fb0c-46be-9020-16bb7e4cb65c-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr\" (UID: \"82699d9d-fb0c-46be-9020-16bb7e4cb65c\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr" Oct 12 07:43:54 crc kubenswrapper[4599]: I1012 07:43:54.931814 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bwlh\" (UniqueName: \"kubernetes.io/projected/82699d9d-fb0c-46be-9020-16bb7e4cb65c-kube-api-access-4bwlh\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr\" (UID: \"82699d9d-fb0c-46be-9020-16bb7e4cb65c\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr" Oct 12 07:43:54 crc kubenswrapper[4599]: I1012 07:43:54.931908 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82699d9d-fb0c-46be-9020-16bb7e4cb65c-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr\" (UID: \"82699d9d-fb0c-46be-9020-16bb7e4cb65c\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr" Oct 12 07:43:54 crc kubenswrapper[4599]: I1012 07:43:54.932181 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82699d9d-fb0c-46be-9020-16bb7e4cb65c-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr\" (UID: \"82699d9d-fb0c-46be-9020-16bb7e4cb65c\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr" Oct 12 07:43:54 crc kubenswrapper[4599]: I1012 07:43:54.932359 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/82699d9d-fb0c-46be-9020-16bb7e4cb65c-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr\" (UID: \"82699d9d-fb0c-46be-9020-16bb7e4cb65c\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr" Oct 12 07:43:54 crc kubenswrapper[4599]: I1012 07:43:54.948614 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bwlh\" (UniqueName: \"kubernetes.io/projected/82699d9d-fb0c-46be-9020-16bb7e4cb65c-kube-api-access-4bwlh\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr\" (UID: \"82699d9d-fb0c-46be-9020-16bb7e4cb65c\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr" Oct 12 07:43:55 crc kubenswrapper[4599]: I1012 07:43:55.119107 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr" Oct 12 07:43:55 crc kubenswrapper[4599]: I1012 07:43:55.479832 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr"] Oct 12 07:43:55 crc kubenswrapper[4599]: I1012 07:43:55.501235 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr" event={"ID":"82699d9d-fb0c-46be-9020-16bb7e4cb65c","Type":"ContainerStarted","Data":"7c73f08e60890777f379bec72fdda25ae506521ff712e3a38bb32db39fec4394"} Oct 12 07:43:56 crc kubenswrapper[4599]: I1012 07:43:56.507992 4599 generic.go:334] "Generic (PLEG): container finished" podID="82699d9d-fb0c-46be-9020-16bb7e4cb65c" containerID="75c97ed435467e06659db49b825570d25dfefd3c01a385233bf30a5e0227a130" exitCode=0 Oct 12 07:43:56 crc kubenswrapper[4599]: I1012 07:43:56.508048 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr" event={"ID":"82699d9d-fb0c-46be-9020-16bb7e4cb65c","Type":"ContainerDied","Data":"75c97ed435467e06659db49b825570d25dfefd3c01a385233bf30a5e0227a130"} Oct 12 07:43:58 crc kubenswrapper[4599]: I1012 07:43:58.322382 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:43:58 crc kubenswrapper[4599]: I1012 07:43:58.322820 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:43:58 crc kubenswrapper[4599]: I1012 07:43:58.524461 4599 generic.go:334] "Generic (PLEG): container finished" podID="82699d9d-fb0c-46be-9020-16bb7e4cb65c" containerID="fae65f7e489f441ff1259c0976e80a0532297922b3708d0c1b0642026dbb5635" exitCode=0 Oct 12 07:43:58 crc kubenswrapper[4599]: I1012 07:43:58.524513 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr" event={"ID":"82699d9d-fb0c-46be-9020-16bb7e4cb65c","Type":"ContainerDied","Data":"fae65f7e489f441ff1259c0976e80a0532297922b3708d0c1b0642026dbb5635"} Oct 12 07:43:59 crc kubenswrapper[4599]: I1012 07:43:59.050878 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rz8xt" Oct 12 07:43:59 crc kubenswrapper[4599]: I1012 07:43:59.533476 4599 generic.go:334] "Generic (PLEG): container finished" podID="82699d9d-fb0c-46be-9020-16bb7e4cb65c" 
containerID="6ce9909a984d8d869911142540c3bc0703e3d567e7eeb3c37d0638f4d6c57366" exitCode=0 Oct 12 07:43:59 crc kubenswrapper[4599]: I1012 07:43:59.533534 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr" event={"ID":"82699d9d-fb0c-46be-9020-16bb7e4cb65c","Type":"ContainerDied","Data":"6ce9909a984d8d869911142540c3bc0703e3d567e7eeb3c37d0638f4d6c57366"} Oct 12 07:44:00 crc kubenswrapper[4599]: I1012 07:44:00.718483 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr" Oct 12 07:44:00 crc kubenswrapper[4599]: I1012 07:44:00.899206 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82699d9d-fb0c-46be-9020-16bb7e4cb65c-util\") pod \"82699d9d-fb0c-46be-9020-16bb7e4cb65c\" (UID: \"82699d9d-fb0c-46be-9020-16bb7e4cb65c\") " Oct 12 07:44:00 crc kubenswrapper[4599]: I1012 07:44:00.899264 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bwlh\" (UniqueName: \"kubernetes.io/projected/82699d9d-fb0c-46be-9020-16bb7e4cb65c-kube-api-access-4bwlh\") pod \"82699d9d-fb0c-46be-9020-16bb7e4cb65c\" (UID: \"82699d9d-fb0c-46be-9020-16bb7e4cb65c\") " Oct 12 07:44:00 crc kubenswrapper[4599]: I1012 07:44:00.899352 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82699d9d-fb0c-46be-9020-16bb7e4cb65c-bundle\") pod \"82699d9d-fb0c-46be-9020-16bb7e4cb65c\" (UID: \"82699d9d-fb0c-46be-9020-16bb7e4cb65c\") " Oct 12 07:44:00 crc kubenswrapper[4599]: I1012 07:44:00.900005 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82699d9d-fb0c-46be-9020-16bb7e4cb65c-bundle" (OuterVolumeSpecName: "bundle") pod 
"82699d9d-fb0c-46be-9020-16bb7e4cb65c" (UID: "82699d9d-fb0c-46be-9020-16bb7e4cb65c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:44:00 crc kubenswrapper[4599]: I1012 07:44:00.905669 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82699d9d-fb0c-46be-9020-16bb7e4cb65c-kube-api-access-4bwlh" (OuterVolumeSpecName: "kube-api-access-4bwlh") pod "82699d9d-fb0c-46be-9020-16bb7e4cb65c" (UID: "82699d9d-fb0c-46be-9020-16bb7e4cb65c"). InnerVolumeSpecName "kube-api-access-4bwlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:44:00 crc kubenswrapper[4599]: I1012 07:44:00.907531 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82699d9d-fb0c-46be-9020-16bb7e4cb65c-util" (OuterVolumeSpecName: "util") pod "82699d9d-fb0c-46be-9020-16bb7e4cb65c" (UID: "82699d9d-fb0c-46be-9020-16bb7e4cb65c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:44:01 crc kubenswrapper[4599]: I1012 07:44:01.001127 4599 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82699d9d-fb0c-46be-9020-16bb7e4cb65c-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:44:01 crc kubenswrapper[4599]: I1012 07:44:01.001149 4599 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82699d9d-fb0c-46be-9020-16bb7e4cb65c-util\") on node \"crc\" DevicePath \"\"" Oct 12 07:44:01 crc kubenswrapper[4599]: I1012 07:44:01.001163 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bwlh\" (UniqueName: \"kubernetes.io/projected/82699d9d-fb0c-46be-9020-16bb7e4cb65c-kube-api-access-4bwlh\") on node \"crc\" DevicePath \"\"" Oct 12 07:44:01 crc kubenswrapper[4599]: I1012 07:44:01.547424 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr" Oct 12 07:44:01 crc kubenswrapper[4599]: I1012 07:44:01.550901 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr" event={"ID":"82699d9d-fb0c-46be-9020-16bb7e4cb65c","Type":"ContainerDied","Data":"7c73f08e60890777f379bec72fdda25ae506521ff712e3a38bb32db39fec4394"} Oct 12 07:44:01 crc kubenswrapper[4599]: I1012 07:44:01.550978 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c73f08e60890777f379bec72fdda25ae506521ff712e3a38bb32db39fec4394" Oct 12 07:44:03 crc kubenswrapper[4599]: I1012 07:44:03.648056 4599 scope.go:117] "RemoveContainer" containerID="17b1c6b7ed4f23206782035e25723153ac8f1f4bdd6b7d5d18ecd73d10dbbb65" Oct 12 07:44:06 crc kubenswrapper[4599]: I1012 07:44:06.347510 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-vz57h"] Oct 12 07:44:06 crc kubenswrapper[4599]: E1012 07:44:06.348040 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82699d9d-fb0c-46be-9020-16bb7e4cb65c" containerName="pull" Oct 12 07:44:06 crc kubenswrapper[4599]: I1012 07:44:06.348054 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="82699d9d-fb0c-46be-9020-16bb7e4cb65c" containerName="pull" Oct 12 07:44:06 crc kubenswrapper[4599]: E1012 07:44:06.348066 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82699d9d-fb0c-46be-9020-16bb7e4cb65c" containerName="extract" Oct 12 07:44:06 crc kubenswrapper[4599]: I1012 07:44:06.348071 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="82699d9d-fb0c-46be-9020-16bb7e4cb65c" containerName="extract" Oct 12 07:44:06 crc kubenswrapper[4599]: E1012 07:44:06.348079 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82699d9d-fb0c-46be-9020-16bb7e4cb65c" containerName="util" 
Oct 12 07:44:06 crc kubenswrapper[4599]: I1012 07:44:06.348085 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="82699d9d-fb0c-46be-9020-16bb7e4cb65c" containerName="util" Oct 12 07:44:06 crc kubenswrapper[4599]: I1012 07:44:06.348186 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="82699d9d-fb0c-46be-9020-16bb7e4cb65c" containerName="extract" Oct 12 07:44:06 crc kubenswrapper[4599]: I1012 07:44:06.348638 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-vz57h" Oct 12 07:44:06 crc kubenswrapper[4599]: I1012 07:44:06.350080 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-jnxst" Oct 12 07:44:06 crc kubenswrapper[4599]: I1012 07:44:06.351138 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 12 07:44:06 crc kubenswrapper[4599]: I1012 07:44:06.352446 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 12 07:44:06 crc kubenswrapper[4599]: I1012 07:44:06.357948 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-vz57h"] Oct 12 07:44:06 crc kubenswrapper[4599]: I1012 07:44:06.459813 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdg4k\" (UniqueName: \"kubernetes.io/projected/87a55b6b-2189-4332-beb0-5bf12c1ded00-kube-api-access-cdg4k\") pod \"nmstate-operator-858ddd8f98-vz57h\" (UID: \"87a55b6b-2189-4332-beb0-5bf12c1ded00\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-vz57h" Oct 12 07:44:06 crc kubenswrapper[4599]: I1012 07:44:06.561053 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdg4k\" (UniqueName: \"kubernetes.io/projected/87a55b6b-2189-4332-beb0-5bf12c1ded00-kube-api-access-cdg4k\") 
pod \"nmstate-operator-858ddd8f98-vz57h\" (UID: \"87a55b6b-2189-4332-beb0-5bf12c1ded00\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-vz57h" Oct 12 07:44:06 crc kubenswrapper[4599]: I1012 07:44:06.578261 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdg4k\" (UniqueName: \"kubernetes.io/projected/87a55b6b-2189-4332-beb0-5bf12c1ded00-kube-api-access-cdg4k\") pod \"nmstate-operator-858ddd8f98-vz57h\" (UID: \"87a55b6b-2189-4332-beb0-5bf12c1ded00\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-vz57h" Oct 12 07:44:06 crc kubenswrapper[4599]: I1012 07:44:06.662303 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-vz57h" Oct 12 07:44:07 crc kubenswrapper[4599]: I1012 07:44:07.010015 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-vz57h"] Oct 12 07:44:07 crc kubenswrapper[4599]: I1012 07:44:07.576909 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-vz57h" event={"ID":"87a55b6b-2189-4332-beb0-5bf12c1ded00","Type":"ContainerStarted","Data":"3ee571a44158f915b316e47b749000c884c2684ed0a6cba8b6d6f68cca15d24e"} Oct 12 07:44:09 crc kubenswrapper[4599]: I1012 07:44:09.591899 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-vz57h" event={"ID":"87a55b6b-2189-4332-beb0-5bf12c1ded00","Type":"ContainerStarted","Data":"95b0a33e385aeb858401edf0fcfbe76ce27eee497f9842e5d29745c6979b8947"} Oct 12 07:44:09 crc kubenswrapper[4599]: I1012 07:44:09.604136 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-vz57h" podStartSLOduration=1.778123627 podStartE2EDuration="3.604118636s" podCreationTimestamp="2025-10-12 07:44:06 +0000 UTC" firstStartedPulling="2025-10-12 07:44:07.021401078 +0000 UTC m=+543.810596580" 
lastFinishedPulling="2025-10-12 07:44:08.847396087 +0000 UTC m=+545.636591589" observedRunningTime="2025-10-12 07:44:09.603305771 +0000 UTC m=+546.392501273" watchObservedRunningTime="2025-10-12 07:44:09.604118636 +0000 UTC m=+546.393314138" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.209265 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-tls82"] Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.210157 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-tls82" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.211848 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-4hrn8" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.222829 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-rb46k"] Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.223476 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rb46k" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.227094 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-tls82"] Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.232456 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.236073 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-mxzhr"] Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.236869 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-mxzhr" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.244292 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-rb46k"] Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.263156 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/cf76296b-ab43-4e40-83c9-ee507169ea4c-nmstate-lock\") pod \"nmstate-handler-mxzhr\" (UID: \"cf76296b-ab43-4e40-83c9-ee507169ea4c\") " pod="openshift-nmstate/nmstate-handler-mxzhr" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.263468 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwlf7\" (UniqueName: \"kubernetes.io/projected/cf76296b-ab43-4e40-83c9-ee507169ea4c-kube-api-access-xwlf7\") pod \"nmstate-handler-mxzhr\" (UID: \"cf76296b-ab43-4e40-83c9-ee507169ea4c\") " pod="openshift-nmstate/nmstate-handler-mxzhr" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.263607 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/cf76296b-ab43-4e40-83c9-ee507169ea4c-dbus-socket\") pod \"nmstate-handler-mxzhr\" (UID: \"cf76296b-ab43-4e40-83c9-ee507169ea4c\") " pod="openshift-nmstate/nmstate-handler-mxzhr" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.263696 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fe7e44b3-5972-4d03-8919-8a67214fee06-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-rb46k\" (UID: \"fe7e44b3-5972-4d03-8919-8a67214fee06\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rb46k" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.264184 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/cf76296b-ab43-4e40-83c9-ee507169ea4c-ovs-socket\") pod \"nmstate-handler-mxzhr\" (UID: \"cf76296b-ab43-4e40-83c9-ee507169ea4c\") " pod="openshift-nmstate/nmstate-handler-mxzhr" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.264266 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx4p8\" (UniqueName: \"kubernetes.io/projected/126c69f6-12a5-46a8-a817-23b97dc624d7-kube-api-access-qx4p8\") pod \"nmstate-metrics-fdff9cb8d-tls82\" (UID: \"126c69f6-12a5-46a8-a817-23b97dc624d7\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-tls82" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.264294 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jvns\" (UniqueName: \"kubernetes.io/projected/fe7e44b3-5972-4d03-8919-8a67214fee06-kube-api-access-7jvns\") pod \"nmstate-webhook-6cdbc54649-rb46k\" (UID: \"fe7e44b3-5972-4d03-8919-8a67214fee06\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rb46k" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.322818 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-klx8j"] Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.323594 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klx8j" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.325198 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.325459 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-rpnbd" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.328849 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.330452 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-klx8j"] Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.365277 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2ff98253-bb25-450e-9202-817788dab660-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-klx8j\" (UID: \"2ff98253-bb25-450e-9202-817788dab660\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klx8j" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.365372 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/cf76296b-ab43-4e40-83c9-ee507169ea4c-dbus-socket\") pod \"nmstate-handler-mxzhr\" (UID: \"cf76296b-ab43-4e40-83c9-ee507169ea4c\") " pod="openshift-nmstate/nmstate-handler-mxzhr" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.365412 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fe7e44b3-5972-4d03-8919-8a67214fee06-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-rb46k\" (UID: \"fe7e44b3-5972-4d03-8919-8a67214fee06\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rb46k" Oct 12 
07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.365436 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ff98253-bb25-450e-9202-817788dab660-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-klx8j\" (UID: \"2ff98253-bb25-450e-9202-817788dab660\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klx8j" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.365466 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/cf76296b-ab43-4e40-83c9-ee507169ea4c-ovs-socket\") pod \"nmstate-handler-mxzhr\" (UID: \"cf76296b-ab43-4e40-83c9-ee507169ea4c\") " pod="openshift-nmstate/nmstate-handler-mxzhr" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.365490 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx4p8\" (UniqueName: \"kubernetes.io/projected/126c69f6-12a5-46a8-a817-23b97dc624d7-kube-api-access-qx4p8\") pod \"nmstate-metrics-fdff9cb8d-tls82\" (UID: \"126c69f6-12a5-46a8-a817-23b97dc624d7\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-tls82" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.365510 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jvns\" (UniqueName: \"kubernetes.io/projected/fe7e44b3-5972-4d03-8919-8a67214fee06-kube-api-access-7jvns\") pod \"nmstate-webhook-6cdbc54649-rb46k\" (UID: \"fe7e44b3-5972-4d03-8919-8a67214fee06\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rb46k" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.365537 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl4lc\" (UniqueName: \"kubernetes.io/projected/2ff98253-bb25-450e-9202-817788dab660-kube-api-access-hl4lc\") pod \"nmstate-console-plugin-6b874cbd85-klx8j\" (UID: 
\"2ff98253-bb25-450e-9202-817788dab660\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klx8j" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.365567 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/cf76296b-ab43-4e40-83c9-ee507169ea4c-nmstate-lock\") pod \"nmstate-handler-mxzhr\" (UID: \"cf76296b-ab43-4e40-83c9-ee507169ea4c\") " pod="openshift-nmstate/nmstate-handler-mxzhr" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.365617 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwlf7\" (UniqueName: \"kubernetes.io/projected/cf76296b-ab43-4e40-83c9-ee507169ea4c-kube-api-access-xwlf7\") pod \"nmstate-handler-mxzhr\" (UID: \"cf76296b-ab43-4e40-83c9-ee507169ea4c\") " pod="openshift-nmstate/nmstate-handler-mxzhr" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.365694 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/cf76296b-ab43-4e40-83c9-ee507169ea4c-dbus-socket\") pod \"nmstate-handler-mxzhr\" (UID: \"cf76296b-ab43-4e40-83c9-ee507169ea4c\") " pod="openshift-nmstate/nmstate-handler-mxzhr" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.365865 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/cf76296b-ab43-4e40-83c9-ee507169ea4c-ovs-socket\") pod \"nmstate-handler-mxzhr\" (UID: \"cf76296b-ab43-4e40-83c9-ee507169ea4c\") " pod="openshift-nmstate/nmstate-handler-mxzhr" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.365983 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/cf76296b-ab43-4e40-83c9-ee507169ea4c-nmstate-lock\") pod \"nmstate-handler-mxzhr\" (UID: \"cf76296b-ab43-4e40-83c9-ee507169ea4c\") " pod="openshift-nmstate/nmstate-handler-mxzhr" Oct 12 
07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.379688 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fe7e44b3-5972-4d03-8919-8a67214fee06-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-rb46k\" (UID: \"fe7e44b3-5972-4d03-8919-8a67214fee06\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rb46k" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.381976 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwlf7\" (UniqueName: \"kubernetes.io/projected/cf76296b-ab43-4e40-83c9-ee507169ea4c-kube-api-access-xwlf7\") pod \"nmstate-handler-mxzhr\" (UID: \"cf76296b-ab43-4e40-83c9-ee507169ea4c\") " pod="openshift-nmstate/nmstate-handler-mxzhr" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.382162 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jvns\" (UniqueName: \"kubernetes.io/projected/fe7e44b3-5972-4d03-8919-8a67214fee06-kube-api-access-7jvns\") pod \"nmstate-webhook-6cdbc54649-rb46k\" (UID: \"fe7e44b3-5972-4d03-8919-8a67214fee06\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rb46k" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.382611 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx4p8\" (UniqueName: \"kubernetes.io/projected/126c69f6-12a5-46a8-a817-23b97dc624d7-kube-api-access-qx4p8\") pod \"nmstate-metrics-fdff9cb8d-tls82\" (UID: \"126c69f6-12a5-46a8-a817-23b97dc624d7\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-tls82" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.466021 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ff98253-bb25-450e-9202-817788dab660-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-klx8j\" (UID: \"2ff98253-bb25-450e-9202-817788dab660\") " 
pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klx8j" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.466088 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl4lc\" (UniqueName: \"kubernetes.io/projected/2ff98253-bb25-450e-9202-817788dab660-kube-api-access-hl4lc\") pod \"nmstate-console-plugin-6b874cbd85-klx8j\" (UID: \"2ff98253-bb25-450e-9202-817788dab660\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klx8j" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.466155 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2ff98253-bb25-450e-9202-817788dab660-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-klx8j\" (UID: \"2ff98253-bb25-450e-9202-817788dab660\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klx8j" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.467233 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2ff98253-bb25-450e-9202-817788dab660-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-klx8j\" (UID: \"2ff98253-bb25-450e-9202-817788dab660\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klx8j" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.469533 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ff98253-bb25-450e-9202-817788dab660-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-klx8j\" (UID: \"2ff98253-bb25-450e-9202-817788dab660\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klx8j" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.483247 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl4lc\" (UniqueName: \"kubernetes.io/projected/2ff98253-bb25-450e-9202-817788dab660-kube-api-access-hl4lc\") pod 
\"nmstate-console-plugin-6b874cbd85-klx8j\" (UID: \"2ff98253-bb25-450e-9202-817788dab660\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klx8j" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.489639 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7dc47c5bb6-m2gd9"] Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.490407 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.501102 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7dc47c5bb6-m2gd9"] Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.524968 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-tls82" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.534416 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rb46k" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.548172 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-mxzhr" Oct 12 07:44:15 crc kubenswrapper[4599]: W1012 07:44:15.572984 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf76296b_ab43_4e40_83c9_ee507169ea4c.slice/crio-e81d0050b3d173f047c53b0b892af6fba665087282db6d355ac1803d2be439a6 WatchSource:0}: Error finding container e81d0050b3d173f047c53b0b892af6fba665087282db6d355ac1803d2be439a6: Status 404 returned error can't find the container with id e81d0050b3d173f047c53b0b892af6fba665087282db6d355ac1803d2be439a6 Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.620309 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mxzhr" event={"ID":"cf76296b-ab43-4e40-83c9-ee507169ea4c","Type":"ContainerStarted","Data":"e81d0050b3d173f047c53b0b892af6fba665087282db6d355ac1803d2be439a6"} Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.635816 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klx8j" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.668020 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdbkp\" (UniqueName: \"kubernetes.io/projected/2bcb636c-8377-489e-a0be-8905bc48bd9d-kube-api-access-vdbkp\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.668057 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2bcb636c-8377-489e-a0be-8905bc48bd9d-oauth-serving-cert\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.668085 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bcb636c-8377-489e-a0be-8905bc48bd9d-console-serving-cert\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.668105 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2bcb636c-8377-489e-a0be-8905bc48bd9d-console-config\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.668257 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2bcb636c-8377-489e-a0be-8905bc48bd9d-trusted-ca-bundle\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.668273 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2bcb636c-8377-489e-a0be-8905bc48bd9d-console-oauth-config\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.668295 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2bcb636c-8377-489e-a0be-8905bc48bd9d-service-ca\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.678599 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-tls82"] Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.727419 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-rb46k"] Oct 12 07:44:15 crc kubenswrapper[4599]: W1012 07:44:15.730524 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe7e44b3_5972_4d03_8919_8a67214fee06.slice/crio-721ed0a9f091500ffa5d45eb2f4bd90beafb544ef28a7c8b3545f3ed1db67455 WatchSource:0}: Error finding container 721ed0a9f091500ffa5d45eb2f4bd90beafb544ef28a7c8b3545f3ed1db67455: Status 404 returned error can't find the container with id 721ed0a9f091500ffa5d45eb2f4bd90beafb544ef28a7c8b3545f3ed1db67455 Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.769029 4599 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdbkp\" (UniqueName: \"kubernetes.io/projected/2bcb636c-8377-489e-a0be-8905bc48bd9d-kube-api-access-vdbkp\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.769076 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2bcb636c-8377-489e-a0be-8905bc48bd9d-oauth-serving-cert\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.769115 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bcb636c-8377-489e-a0be-8905bc48bd9d-console-serving-cert\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.769134 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2bcb636c-8377-489e-a0be-8905bc48bd9d-console-config\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.769229 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2bcb636c-8377-489e-a0be-8905bc48bd9d-trusted-ca-bundle\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.769247 4599 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2bcb636c-8377-489e-a0be-8905bc48bd9d-console-oauth-config\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.769272 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2bcb636c-8377-489e-a0be-8905bc48bd9d-service-ca\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.770167 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2bcb636c-8377-489e-a0be-8905bc48bd9d-oauth-serving-cert\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.770199 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2bcb636c-8377-489e-a0be-8905bc48bd9d-service-ca\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.770807 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2bcb636c-8377-489e-a0be-8905bc48bd9d-console-config\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.770933 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2bcb636c-8377-489e-a0be-8905bc48bd9d-trusted-ca-bundle\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.778292 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2bcb636c-8377-489e-a0be-8905bc48bd9d-console-oauth-config\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.779224 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bcb636c-8377-489e-a0be-8905bc48bd9d-console-serving-cert\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.786563 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdbkp\" (UniqueName: \"kubernetes.io/projected/2bcb636c-8377-489e-a0be-8905bc48bd9d-kube-api-access-vdbkp\") pod \"console-7dc47c5bb6-m2gd9\" (UID: \"2bcb636c-8377-489e-a0be-8905bc48bd9d\") " pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.812694 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.819775 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-klx8j"] Oct 12 07:44:15 crc kubenswrapper[4599]: W1012 07:44:15.825123 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ff98253_bb25_450e_9202_817788dab660.slice/crio-39859fd9de8e59920d6038e88a45e1a017905a723a4c5b16d63b5f6065aa00b5 WatchSource:0}: Error finding container 39859fd9de8e59920d6038e88a45e1a017905a723a4c5b16d63b5f6065aa00b5: Status 404 returned error can't find the container with id 39859fd9de8e59920d6038e88a45e1a017905a723a4c5b16d63b5f6065aa00b5 Oct 12 07:44:15 crc kubenswrapper[4599]: I1012 07:44:15.974462 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7dc47c5bb6-m2gd9"] Oct 12 07:44:15 crc kubenswrapper[4599]: W1012 07:44:15.985865 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bcb636c_8377_489e_a0be_8905bc48bd9d.slice/crio-5c3a785dd99b80d376954f221641c2a89bbf8a6f9ac840f80459f24cabbbd892 WatchSource:0}: Error finding container 5c3a785dd99b80d376954f221641c2a89bbf8a6f9ac840f80459f24cabbbd892: Status 404 returned error can't find the container with id 5c3a785dd99b80d376954f221641c2a89bbf8a6f9ac840f80459f24cabbbd892 Oct 12 07:44:16 crc kubenswrapper[4599]: I1012 07:44:16.626545 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klx8j" event={"ID":"2ff98253-bb25-450e-9202-817788dab660","Type":"ContainerStarted","Data":"39859fd9de8e59920d6038e88a45e1a017905a723a4c5b16d63b5f6065aa00b5"} Oct 12 07:44:16 crc kubenswrapper[4599]: I1012 07:44:16.627442 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-tls82" event={"ID":"126c69f6-12a5-46a8-a817-23b97dc624d7","Type":"ContainerStarted","Data":"f9a53e85aa75ea39d63c48b8cdc17bef25953a8140ad032f0c7b2e2879be2a49"} Oct 12 07:44:16 crc kubenswrapper[4599]: I1012 07:44:16.628864 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7dc47c5bb6-m2gd9" event={"ID":"2bcb636c-8377-489e-a0be-8905bc48bd9d","Type":"ContainerStarted","Data":"6745b75df190028a7dfa1fb99d5961532f34c27193a9cac01d5615f04e4e771f"} Oct 12 07:44:16 crc kubenswrapper[4599]: I1012 07:44:16.628892 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7dc47c5bb6-m2gd9" event={"ID":"2bcb636c-8377-489e-a0be-8905bc48bd9d","Type":"ContainerStarted","Data":"5c3a785dd99b80d376954f221641c2a89bbf8a6f9ac840f80459f24cabbbd892"} Oct 12 07:44:16 crc kubenswrapper[4599]: I1012 07:44:16.629686 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rb46k" event={"ID":"fe7e44b3-5972-4d03-8919-8a67214fee06","Type":"ContainerStarted","Data":"721ed0a9f091500ffa5d45eb2f4bd90beafb544ef28a7c8b3545f3ed1db67455"} Oct 12 07:44:16 crc kubenswrapper[4599]: I1012 07:44:16.652812 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7dc47c5bb6-m2gd9" podStartSLOduration=1.652795882 podStartE2EDuration="1.652795882s" podCreationTimestamp="2025-10-12 07:44:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:44:16.649187533 +0000 UTC m=+553.438383035" watchObservedRunningTime="2025-10-12 07:44:16.652795882 +0000 UTC m=+553.441991384" Oct 12 07:44:18 crc kubenswrapper[4599]: I1012 07:44:18.644025 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rb46k" 
event={"ID":"fe7e44b3-5972-4d03-8919-8a67214fee06","Type":"ContainerStarted","Data":"0ca52790357ee5f3177298a032f943ae98da7a1bd1d57117798809eb56570b1c"} Oct 12 07:44:18 crc kubenswrapper[4599]: I1012 07:44:18.644307 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rb46k" Oct 12 07:44:18 crc kubenswrapper[4599]: I1012 07:44:18.645528 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mxzhr" event={"ID":"cf76296b-ab43-4e40-83c9-ee507169ea4c","Type":"ContainerStarted","Data":"576386d1070554b20b21015a68ae906604ae76fa39bfb69c7735bf064cd24b10"} Oct 12 07:44:18 crc kubenswrapper[4599]: I1012 07:44:18.645593 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-mxzhr" Oct 12 07:44:18 crc kubenswrapper[4599]: I1012 07:44:18.646723 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klx8j" event={"ID":"2ff98253-bb25-450e-9202-817788dab660","Type":"ContainerStarted","Data":"b4c17417dbf28b575f1e6c9ef29951e0ae6198c8f6c1c50deda8a48399ee5e96"} Oct 12 07:44:18 crc kubenswrapper[4599]: I1012 07:44:18.647908 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-tls82" event={"ID":"126c69f6-12a5-46a8-a817-23b97dc624d7","Type":"ContainerStarted","Data":"6c800951d3bc4d4a286a397f93d0caa0dfbe6e728208337ac8f457e36332a923"} Oct 12 07:44:18 crc kubenswrapper[4599]: I1012 07:44:18.659934 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rb46k" podStartSLOduration=1.330114468 podStartE2EDuration="3.659918901s" podCreationTimestamp="2025-10-12 07:44:15 +0000 UTC" firstStartedPulling="2025-10-12 07:44:15.732242762 +0000 UTC m=+552.521438264" lastFinishedPulling="2025-10-12 07:44:18.062047194 +0000 UTC m=+554.851242697" observedRunningTime="2025-10-12 
07:44:18.65588639 +0000 UTC m=+555.445081893" watchObservedRunningTime="2025-10-12 07:44:18.659918901 +0000 UTC m=+555.449114404" Oct 12 07:44:18 crc kubenswrapper[4599]: I1012 07:44:18.677704 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klx8j" podStartSLOduration=1.442643619 podStartE2EDuration="3.677684745s" podCreationTimestamp="2025-10-12 07:44:15 +0000 UTC" firstStartedPulling="2025-10-12 07:44:15.82712326 +0000 UTC m=+552.616318762" lastFinishedPulling="2025-10-12 07:44:18.062164386 +0000 UTC m=+554.851359888" observedRunningTime="2025-10-12 07:44:18.676190945 +0000 UTC m=+555.465386447" watchObservedRunningTime="2025-10-12 07:44:18.677684745 +0000 UTC m=+555.466880247" Oct 12 07:44:18 crc kubenswrapper[4599]: I1012 07:44:18.692183 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-mxzhr" podStartSLOduration=1.202918375 podStartE2EDuration="3.692163964s" podCreationTimestamp="2025-10-12 07:44:15 +0000 UTC" firstStartedPulling="2025-10-12 07:44:15.576953101 +0000 UTC m=+552.366148604" lastFinishedPulling="2025-10-12 07:44:18.06619869 +0000 UTC m=+554.855394193" observedRunningTime="2025-10-12 07:44:18.687704697 +0000 UTC m=+555.476900199" watchObservedRunningTime="2025-10-12 07:44:18.692163964 +0000 UTC m=+555.481359467" Oct 12 07:44:20 crc kubenswrapper[4599]: I1012 07:44:20.659287 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-tls82" event={"ID":"126c69f6-12a5-46a8-a817-23b97dc624d7","Type":"ContainerStarted","Data":"0180c27db76a29507cc0d197088f7b1d44a332e2361de4b57653dbc1912a3875"} Oct 12 07:44:20 crc kubenswrapper[4599]: I1012 07:44:20.672232 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-tls82" podStartSLOduration=1.298315209 podStartE2EDuration="5.672211939s" 
podCreationTimestamp="2025-10-12 07:44:15 +0000 UTC" firstStartedPulling="2025-10-12 07:44:15.692073071 +0000 UTC m=+552.481268574" lastFinishedPulling="2025-10-12 07:44:20.065969803 +0000 UTC m=+556.855165304" observedRunningTime="2025-10-12 07:44:20.671913696 +0000 UTC m=+557.461109198" watchObservedRunningTime="2025-10-12 07:44:20.672211939 +0000 UTC m=+557.461407441" Oct 12 07:44:25 crc kubenswrapper[4599]: I1012 07:44:25.567252 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-mxzhr" Oct 12 07:44:25 crc kubenswrapper[4599]: I1012 07:44:25.813630 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:25 crc kubenswrapper[4599]: I1012 07:44:25.813813 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:25 crc kubenswrapper[4599]: I1012 07:44:25.818826 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:26 crc kubenswrapper[4599]: I1012 07:44:26.697104 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7dc47c5bb6-m2gd9" Oct 12 07:44:26 crc kubenswrapper[4599]: I1012 07:44:26.736503 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-lwg5q"] Oct 12 07:44:28 crc kubenswrapper[4599]: I1012 07:44:28.322457 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:44:28 crc kubenswrapper[4599]: I1012 07:44:28.322818 4599 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:44:28 crc kubenswrapper[4599]: I1012 07:44:28.322874 4599 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 07:44:28 crc kubenswrapper[4599]: I1012 07:44:28.323575 4599 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa4ea3304924aa5e47754fb164316c2f3c9af596068fee357fa89cb1b44eb67a"} pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 07:44:28 crc kubenswrapper[4599]: I1012 07:44:28.323632 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" containerID="cri-o://fa4ea3304924aa5e47754fb164316c2f3c9af596068fee357fa89cb1b44eb67a" gracePeriod=600 Oct 12 07:44:28 crc kubenswrapper[4599]: I1012 07:44:28.707132 4599 generic.go:334] "Generic (PLEG): container finished" podID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerID="fa4ea3304924aa5e47754fb164316c2f3c9af596068fee357fa89cb1b44eb67a" exitCode=0 Oct 12 07:44:28 crc kubenswrapper[4599]: I1012 07:44:28.707178 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerDied","Data":"fa4ea3304924aa5e47754fb164316c2f3c9af596068fee357fa89cb1b44eb67a"} Oct 12 07:44:28 crc kubenswrapper[4599]: I1012 07:44:28.707499 4599 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerStarted","Data":"62dd115f3eaf8ba983cf13f3b84adc51fbb09341d1c83aeb28106a411652e265"} Oct 12 07:44:28 crc kubenswrapper[4599]: I1012 07:44:28.707522 4599 scope.go:117] "RemoveContainer" containerID="96c58ca22fd64ff7166d37c6b5588563180da0da78f1666f39d10296101df256" Oct 12 07:44:35 crc kubenswrapper[4599]: I1012 07:44:35.540221 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rb46k" Oct 12 07:44:46 crc kubenswrapper[4599]: I1012 07:44:46.495273 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v"] Oct 12 07:44:46 crc kubenswrapper[4599]: I1012 07:44:46.497794 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v" Oct 12 07:44:46 crc kubenswrapper[4599]: I1012 07:44:46.500654 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 12 07:44:46 crc kubenswrapper[4599]: I1012 07:44:46.504639 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v"] Oct 12 07:44:46 crc kubenswrapper[4599]: I1012 07:44:46.535779 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/492ddcff-667c-4dda-b878-741413cb8aa1-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v\" (UID: \"492ddcff-667c-4dda-b878-741413cb8aa1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v" Oct 12 07:44:46 crc kubenswrapper[4599]: I1012 07:44:46.535863 4599 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/492ddcff-667c-4dda-b878-741413cb8aa1-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v\" (UID: \"492ddcff-667c-4dda-b878-741413cb8aa1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v" Oct 12 07:44:46 crc kubenswrapper[4599]: I1012 07:44:46.535913 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl7n8\" (UniqueName: \"kubernetes.io/projected/492ddcff-667c-4dda-b878-741413cb8aa1-kube-api-access-fl7n8\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v\" (UID: \"492ddcff-667c-4dda-b878-741413cb8aa1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v" Oct 12 07:44:46 crc kubenswrapper[4599]: I1012 07:44:46.638177 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/492ddcff-667c-4dda-b878-741413cb8aa1-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v\" (UID: \"492ddcff-667c-4dda-b878-741413cb8aa1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v" Oct 12 07:44:46 crc kubenswrapper[4599]: I1012 07:44:46.638296 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/492ddcff-667c-4dda-b878-741413cb8aa1-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v\" (UID: \"492ddcff-667c-4dda-b878-741413cb8aa1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v" Oct 12 07:44:46 crc kubenswrapper[4599]: I1012 07:44:46.638387 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl7n8\" (UniqueName: 
\"kubernetes.io/projected/492ddcff-667c-4dda-b878-741413cb8aa1-kube-api-access-fl7n8\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v\" (UID: \"492ddcff-667c-4dda-b878-741413cb8aa1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v" Oct 12 07:44:46 crc kubenswrapper[4599]: I1012 07:44:46.639252 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/492ddcff-667c-4dda-b878-741413cb8aa1-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v\" (UID: \"492ddcff-667c-4dda-b878-741413cb8aa1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v" Oct 12 07:44:46 crc kubenswrapper[4599]: I1012 07:44:46.639440 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/492ddcff-667c-4dda-b878-741413cb8aa1-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v\" (UID: \"492ddcff-667c-4dda-b878-741413cb8aa1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v" Oct 12 07:44:46 crc kubenswrapper[4599]: I1012 07:44:46.658457 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl7n8\" (UniqueName: \"kubernetes.io/projected/492ddcff-667c-4dda-b878-741413cb8aa1-kube-api-access-fl7n8\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v\" (UID: \"492ddcff-667c-4dda-b878-741413cb8aa1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v" Oct 12 07:44:46 crc kubenswrapper[4599]: I1012 07:44:46.812518 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v" Oct 12 07:44:47 crc kubenswrapper[4599]: I1012 07:44:47.163112 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v"] Oct 12 07:44:47 crc kubenswrapper[4599]: I1012 07:44:47.806292 4599 generic.go:334] "Generic (PLEG): container finished" podID="492ddcff-667c-4dda-b878-741413cb8aa1" containerID="99c9614f6565c5772a1c785dc8b5fd258805b10d2966711a9367de55609f61fd" exitCode=0 Oct 12 07:44:47 crc kubenswrapper[4599]: I1012 07:44:47.806407 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v" event={"ID":"492ddcff-667c-4dda-b878-741413cb8aa1","Type":"ContainerDied","Data":"99c9614f6565c5772a1c785dc8b5fd258805b10d2966711a9367de55609f61fd"} Oct 12 07:44:47 crc kubenswrapper[4599]: I1012 07:44:47.806663 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v" event={"ID":"492ddcff-667c-4dda-b878-741413cb8aa1","Type":"ContainerStarted","Data":"8d70678fb2bc85fa0823d166d46993320829f3f0a04cbdc75439f4ec2946c159"} Oct 12 07:44:49 crc kubenswrapper[4599]: I1012 07:44:49.820841 4599 generic.go:334] "Generic (PLEG): container finished" podID="492ddcff-667c-4dda-b878-741413cb8aa1" containerID="f88b9c6fa6b50723904d2ee7dd9fc7775eadc2905701d509498ff8072da93292" exitCode=0 Oct 12 07:44:49 crc kubenswrapper[4599]: I1012 07:44:49.820927 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v" event={"ID":"492ddcff-667c-4dda-b878-741413cb8aa1","Type":"ContainerDied","Data":"f88b9c6fa6b50723904d2ee7dd9fc7775eadc2905701d509498ff8072da93292"} Oct 12 07:44:50 crc kubenswrapper[4599]: I1012 07:44:50.830597 4599 
generic.go:334] "Generic (PLEG): container finished" podID="492ddcff-667c-4dda-b878-741413cb8aa1" containerID="0588676e4bcc0d313284fb32281a320db6240d00812ddd42102f337b086fb1d1" exitCode=0 Oct 12 07:44:50 crc kubenswrapper[4599]: I1012 07:44:50.830702 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v" event={"ID":"492ddcff-667c-4dda-b878-741413cb8aa1","Type":"ContainerDied","Data":"0588676e4bcc0d313284fb32281a320db6240d00812ddd42102f337b086fb1d1"} Oct 12 07:44:51 crc kubenswrapper[4599]: I1012 07:44:51.765663 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-lwg5q" podUID="f1daac45-8c29-46d6-a4ca-6c42bc99f1f7" containerName="console" containerID="cri-o://0edded123a6780987964859623160fc4f4e89c2d9bb63fe0b8cf6d07b7742584" gracePeriod=15 Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.041324 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.091880 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lwg5q_f1daac45-8c29-46d6-a4ca-6c42bc99f1f7/console/0.log" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.091950 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.209907 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-trusted-ca-bundle\") pod \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.209948 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-oauth-config\") pod \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.209994 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-service-ca\") pod \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.210034 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-oauth-serving-cert\") pod \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.210196 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-serving-cert\") pod \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.210241 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-config\") pod \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.210272 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd5gw\" (UniqueName: \"kubernetes.io/projected/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-kube-api-access-kd5gw\") pod \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\" (UID: \"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7\") " Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.210318 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl7n8\" (UniqueName: \"kubernetes.io/projected/492ddcff-667c-4dda-b878-741413cb8aa1-kube-api-access-fl7n8\") pod \"492ddcff-667c-4dda-b878-741413cb8aa1\" (UID: \"492ddcff-667c-4dda-b878-741413cb8aa1\") " Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.210392 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/492ddcff-667c-4dda-b878-741413cb8aa1-bundle\") pod \"492ddcff-667c-4dda-b878-741413cb8aa1\" (UID: \"492ddcff-667c-4dda-b878-741413cb8aa1\") " Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.210454 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/492ddcff-667c-4dda-b878-741413cb8aa1-util\") pod \"492ddcff-667c-4dda-b878-741413cb8aa1\" (UID: \"492ddcff-667c-4dda-b878-741413cb8aa1\") " Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.211541 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f1daac45-8c29-46d6-a4ca-6c42bc99f1f7" (UID: "f1daac45-8c29-46d6-a4ca-6c42bc99f1f7"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.211536 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/492ddcff-667c-4dda-b878-741413cb8aa1-bundle" (OuterVolumeSpecName: "bundle") pod "492ddcff-667c-4dda-b878-741413cb8aa1" (UID: "492ddcff-667c-4dda-b878-741413cb8aa1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.211700 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-service-ca" (OuterVolumeSpecName: "service-ca") pod "f1daac45-8c29-46d6-a4ca-6c42bc99f1f7" (UID: "f1daac45-8c29-46d6-a4ca-6c42bc99f1f7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.212103 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f1daac45-8c29-46d6-a4ca-6c42bc99f1f7" (UID: "f1daac45-8c29-46d6-a4ca-6c42bc99f1f7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.212210 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-config" (OuterVolumeSpecName: "console-config") pod "f1daac45-8c29-46d6-a4ca-6c42bc99f1f7" (UID: "f1daac45-8c29-46d6-a4ca-6c42bc99f1f7"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.216282 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f1daac45-8c29-46d6-a4ca-6c42bc99f1f7" (UID: "f1daac45-8c29-46d6-a4ca-6c42bc99f1f7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.216422 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/492ddcff-667c-4dda-b878-741413cb8aa1-kube-api-access-fl7n8" (OuterVolumeSpecName: "kube-api-access-fl7n8") pod "492ddcff-667c-4dda-b878-741413cb8aa1" (UID: "492ddcff-667c-4dda-b878-741413cb8aa1"). InnerVolumeSpecName "kube-api-access-fl7n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.216764 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f1daac45-8c29-46d6-a4ca-6c42bc99f1f7" (UID: "f1daac45-8c29-46d6-a4ca-6c42bc99f1f7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.216849 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-kube-api-access-kd5gw" (OuterVolumeSpecName: "kube-api-access-kd5gw") pod "f1daac45-8c29-46d6-a4ca-6c42bc99f1f7" (UID: "f1daac45-8c29-46d6-a4ca-6c42bc99f1f7"). InnerVolumeSpecName "kube-api-access-kd5gw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.221267 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/492ddcff-667c-4dda-b878-741413cb8aa1-util" (OuterVolumeSpecName: "util") pod "492ddcff-667c-4dda-b878-741413cb8aa1" (UID: "492ddcff-667c-4dda-b878-741413cb8aa1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.312822 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd5gw\" (UniqueName: \"kubernetes.io/projected/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-kube-api-access-kd5gw\") on node \"crc\" DevicePath \"\"" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.312863 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl7n8\" (UniqueName: \"kubernetes.io/projected/492ddcff-667c-4dda-b878-741413cb8aa1-kube-api-access-fl7n8\") on node \"crc\" DevicePath \"\"" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.312876 4599 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/492ddcff-667c-4dda-b878-741413cb8aa1-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.312887 4599 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/492ddcff-667c-4dda-b878-741413cb8aa1-util\") on node \"crc\" DevicePath \"\"" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.312903 4599 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.312913 4599 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.312922 4599 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-service-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.312931 4599 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.312940 4599 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.312951 4599 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7-console-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.847730 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lwg5q_f1daac45-8c29-46d6-a4ca-6c42bc99f1f7/console/0.log" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.847780 4599 generic.go:334] "Generic (PLEG): container finished" podID="f1daac45-8c29-46d6-a4ca-6c42bc99f1f7" containerID="0edded123a6780987964859623160fc4f4e89c2d9bb63fe0b8cf6d07b7742584" exitCode=2 Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.847902 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-lwg5q" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.847887 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lwg5q" event={"ID":"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7","Type":"ContainerDied","Data":"0edded123a6780987964859623160fc4f4e89c2d9bb63fe0b8cf6d07b7742584"} Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.847951 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lwg5q" event={"ID":"f1daac45-8c29-46d6-a4ca-6c42bc99f1f7","Type":"ContainerDied","Data":"7132b8e06fe2a89d206adaabce104a6b553eaf5058c69e921243ce0afd346fbf"} Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.847990 4599 scope.go:117] "RemoveContainer" containerID="0edded123a6780987964859623160fc4f4e89c2d9bb63fe0b8cf6d07b7742584" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.851725 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v" event={"ID":"492ddcff-667c-4dda-b878-741413cb8aa1","Type":"ContainerDied","Data":"8d70678fb2bc85fa0823d166d46993320829f3f0a04cbdc75439f4ec2946c159"} Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.851759 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d70678fb2bc85fa0823d166d46993320829f3f0a04cbdc75439f4ec2946c159" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.851847 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.863251 4599 scope.go:117] "RemoveContainer" containerID="0edded123a6780987964859623160fc4f4e89c2d9bb63fe0b8cf6d07b7742584" Oct 12 07:44:52 crc kubenswrapper[4599]: E1012 07:44:52.863585 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0edded123a6780987964859623160fc4f4e89c2d9bb63fe0b8cf6d07b7742584\": container with ID starting with 0edded123a6780987964859623160fc4f4e89c2d9bb63fe0b8cf6d07b7742584 not found: ID does not exist" containerID="0edded123a6780987964859623160fc4f4e89c2d9bb63fe0b8cf6d07b7742584" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.863615 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0edded123a6780987964859623160fc4f4e89c2d9bb63fe0b8cf6d07b7742584"} err="failed to get container status \"0edded123a6780987964859623160fc4f4e89c2d9bb63fe0b8cf6d07b7742584\": rpc error: code = NotFound desc = could not find container \"0edded123a6780987964859623160fc4f4e89c2d9bb63fe0b8cf6d07b7742584\": container with ID starting with 0edded123a6780987964859623160fc4f4e89c2d9bb63fe0b8cf6d07b7742584 not found: ID does not exist" Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.875525 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-lwg5q"] Oct 12 07:44:52 crc kubenswrapper[4599]: I1012 07:44:52.878269 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-lwg5q"] Oct 12 07:44:53 crc kubenswrapper[4599]: I1012 07:44:53.551232 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1daac45-8c29-46d6-a4ca-6c42bc99f1f7" path="/var/lib/kubelet/pods/f1daac45-8c29-46d6-a4ca-6c42bc99f1f7/volumes" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 
07:45:00.128725 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn"] Oct 12 07:45:00 crc kubenswrapper[4599]: E1012 07:45:00.129473 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1daac45-8c29-46d6-a4ca-6c42bc99f1f7" containerName="console" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.129487 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1daac45-8c29-46d6-a4ca-6c42bc99f1f7" containerName="console" Oct 12 07:45:00 crc kubenswrapper[4599]: E1012 07:45:00.129502 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="492ddcff-667c-4dda-b878-741413cb8aa1" containerName="util" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.129508 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="492ddcff-667c-4dda-b878-741413cb8aa1" containerName="util" Oct 12 07:45:00 crc kubenswrapper[4599]: E1012 07:45:00.129519 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="492ddcff-667c-4dda-b878-741413cb8aa1" containerName="extract" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.129526 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="492ddcff-667c-4dda-b878-741413cb8aa1" containerName="extract" Oct 12 07:45:00 crc kubenswrapper[4599]: E1012 07:45:00.129534 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="492ddcff-667c-4dda-b878-741413cb8aa1" containerName="pull" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.129539 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="492ddcff-667c-4dda-b878-741413cb8aa1" containerName="pull" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.129645 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="492ddcff-667c-4dda-b878-741413cb8aa1" containerName="extract" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.129654 4599 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f1daac45-8c29-46d6-a4ca-6c42bc99f1f7" containerName="console" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.130051 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.132035 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.132169 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.140363 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn"] Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.315467 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6dd7df5-77c0-4ffb-9d14-425465bb9ab3-config-volume\") pod \"collect-profiles-29337585-gh2qn\" (UID: \"b6dd7df5-77c0-4ffb-9d14-425465bb9ab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.315665 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlzdz\" (UniqueName: \"kubernetes.io/projected/b6dd7df5-77c0-4ffb-9d14-425465bb9ab3-kube-api-access-rlzdz\") pod \"collect-profiles-29337585-gh2qn\" (UID: \"b6dd7df5-77c0-4ffb-9d14-425465bb9ab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.315863 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b6dd7df5-77c0-4ffb-9d14-425465bb9ab3-secret-volume\") pod \"collect-profiles-29337585-gh2qn\" (UID: \"b6dd7df5-77c0-4ffb-9d14-425465bb9ab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.417142 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6dd7df5-77c0-4ffb-9d14-425465bb9ab3-secret-volume\") pod \"collect-profiles-29337585-gh2qn\" (UID: \"b6dd7df5-77c0-4ffb-9d14-425465bb9ab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.417234 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6dd7df5-77c0-4ffb-9d14-425465bb9ab3-config-volume\") pod \"collect-profiles-29337585-gh2qn\" (UID: \"b6dd7df5-77c0-4ffb-9d14-425465bb9ab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.417289 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlzdz\" (UniqueName: \"kubernetes.io/projected/b6dd7df5-77c0-4ffb-9d14-425465bb9ab3-kube-api-access-rlzdz\") pod \"collect-profiles-29337585-gh2qn\" (UID: \"b6dd7df5-77c0-4ffb-9d14-425465bb9ab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.418526 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6dd7df5-77c0-4ffb-9d14-425465bb9ab3-config-volume\") pod \"collect-profiles-29337585-gh2qn\" (UID: \"b6dd7df5-77c0-4ffb-9d14-425465bb9ab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.425044 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6dd7df5-77c0-4ffb-9d14-425465bb9ab3-secret-volume\") pod \"collect-profiles-29337585-gh2qn\" (UID: \"b6dd7df5-77c0-4ffb-9d14-425465bb9ab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.436849 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlzdz\" (UniqueName: \"kubernetes.io/projected/b6dd7df5-77c0-4ffb-9d14-425465bb9ab3-kube-api-access-rlzdz\") pod \"collect-profiles-29337585-gh2qn\" (UID: \"b6dd7df5-77c0-4ffb-9d14-425465bb9ab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.447802 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn" Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.825180 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn"] Oct 12 07:45:00 crc kubenswrapper[4599]: W1012 07:45:00.837453 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6dd7df5_77c0_4ffb_9d14_425465bb9ab3.slice/crio-4c9ebcce603996059a2cd9cf9e3d47b7705d9410fb632f75638f89b270c0798c WatchSource:0}: Error finding container 4c9ebcce603996059a2cd9cf9e3d47b7705d9410fb632f75638f89b270c0798c: Status 404 returned error can't find the container with id 4c9ebcce603996059a2cd9cf9e3d47b7705d9410fb632f75638f89b270c0798c Oct 12 07:45:00 crc kubenswrapper[4599]: I1012 07:45:00.900322 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn" 
event={"ID":"b6dd7df5-77c0-4ffb-9d14-425465bb9ab3","Type":"ContainerStarted","Data":"4c9ebcce603996059a2cd9cf9e3d47b7705d9410fb632f75638f89b270c0798c"} Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.709129 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-75d566c47b-2dhk8"] Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.709990 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-75d566c47b-2dhk8" Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.713675 4599 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.714092 4599 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-vphgj" Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.714238 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.714405 4599 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.714530 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.715042 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-75d566c47b-2dhk8"] Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.837576 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14e65bd0-37ae-438c-9d25-b2d4b70556e7-apiservice-cert\") pod \"metallb-operator-controller-manager-75d566c47b-2dhk8\" 
(UID: \"14e65bd0-37ae-438c-9d25-b2d4b70556e7\") " pod="metallb-system/metallb-operator-controller-manager-75d566c47b-2dhk8" Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.837838 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14e65bd0-37ae-438c-9d25-b2d4b70556e7-webhook-cert\") pod \"metallb-operator-controller-manager-75d566c47b-2dhk8\" (UID: \"14e65bd0-37ae-438c-9d25-b2d4b70556e7\") " pod="metallb-system/metallb-operator-controller-manager-75d566c47b-2dhk8" Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.837954 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkzt7\" (UniqueName: \"kubernetes.io/projected/14e65bd0-37ae-438c-9d25-b2d4b70556e7-kube-api-access-gkzt7\") pod \"metallb-operator-controller-manager-75d566c47b-2dhk8\" (UID: \"14e65bd0-37ae-438c-9d25-b2d4b70556e7\") " pod="metallb-system/metallb-operator-controller-manager-75d566c47b-2dhk8" Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.907411 4599 generic.go:334] "Generic (PLEG): container finished" podID="b6dd7df5-77c0-4ffb-9d14-425465bb9ab3" containerID="4ad77157cd9ccdd7484b815bcc542a463380fb30bb7bec230944a05c2d9c6007" exitCode=0 Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.907508 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn" event={"ID":"b6dd7df5-77c0-4ffb-9d14-425465bb9ab3","Type":"ContainerDied","Data":"4ad77157cd9ccdd7484b815bcc542a463380fb30bb7bec230944a05c2d9c6007"} Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.939245 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14e65bd0-37ae-438c-9d25-b2d4b70556e7-apiservice-cert\") pod \"metallb-operator-controller-manager-75d566c47b-2dhk8\" (UID: 
\"14e65bd0-37ae-438c-9d25-b2d4b70556e7\") " pod="metallb-system/metallb-operator-controller-manager-75d566c47b-2dhk8" Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.939316 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14e65bd0-37ae-438c-9d25-b2d4b70556e7-webhook-cert\") pod \"metallb-operator-controller-manager-75d566c47b-2dhk8\" (UID: \"14e65bd0-37ae-438c-9d25-b2d4b70556e7\") " pod="metallb-system/metallb-operator-controller-manager-75d566c47b-2dhk8" Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.939352 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkzt7\" (UniqueName: \"kubernetes.io/projected/14e65bd0-37ae-438c-9d25-b2d4b70556e7-kube-api-access-gkzt7\") pod \"metallb-operator-controller-manager-75d566c47b-2dhk8\" (UID: \"14e65bd0-37ae-438c-9d25-b2d4b70556e7\") " pod="metallb-system/metallb-operator-controller-manager-75d566c47b-2dhk8" Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.945306 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14e65bd0-37ae-438c-9d25-b2d4b70556e7-webhook-cert\") pod \"metallb-operator-controller-manager-75d566c47b-2dhk8\" (UID: \"14e65bd0-37ae-438c-9d25-b2d4b70556e7\") " pod="metallb-system/metallb-operator-controller-manager-75d566c47b-2dhk8" Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.947697 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14e65bd0-37ae-438c-9d25-b2d4b70556e7-apiservice-cert\") pod \"metallb-operator-controller-manager-75d566c47b-2dhk8\" (UID: \"14e65bd0-37ae-438c-9d25-b2d4b70556e7\") " pod="metallb-system/metallb-operator-controller-manager-75d566c47b-2dhk8" Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.954103 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-gkzt7\" (UniqueName: \"kubernetes.io/projected/14e65bd0-37ae-438c-9d25-b2d4b70556e7-kube-api-access-gkzt7\") pod \"metallb-operator-controller-manager-75d566c47b-2dhk8\" (UID: \"14e65bd0-37ae-438c-9d25-b2d4b70556e7\") " pod="metallb-system/metallb-operator-controller-manager-75d566c47b-2dhk8" Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.975626 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5c94cffdb4-lj8nf"] Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.976498 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5c94cffdb4-lj8nf" Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.978930 4599 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.979154 4599 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-j5vdd" Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.979277 4599 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 12 07:45:01 crc kubenswrapper[4599]: I1012 07:45:01.987654 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5c94cffdb4-lj8nf"] Oct 12 07:45:02 crc kubenswrapper[4599]: I1012 07:45:02.024744 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-75d566c47b-2dhk8" Oct 12 07:45:02 crc kubenswrapper[4599]: I1012 07:45:02.040693 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bb4359a2-e2a1-4e37-b7df-420ab49781c6-apiservice-cert\") pod \"metallb-operator-webhook-server-5c94cffdb4-lj8nf\" (UID: \"bb4359a2-e2a1-4e37-b7df-420ab49781c6\") " pod="metallb-system/metallb-operator-webhook-server-5c94cffdb4-lj8nf" Oct 12 07:45:02 crc kubenswrapper[4599]: I1012 07:45:02.040753 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bb4359a2-e2a1-4e37-b7df-420ab49781c6-webhook-cert\") pod \"metallb-operator-webhook-server-5c94cffdb4-lj8nf\" (UID: \"bb4359a2-e2a1-4e37-b7df-420ab49781c6\") " pod="metallb-system/metallb-operator-webhook-server-5c94cffdb4-lj8nf" Oct 12 07:45:02 crc kubenswrapper[4599]: I1012 07:45:02.040794 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhkw2\" (UniqueName: \"kubernetes.io/projected/bb4359a2-e2a1-4e37-b7df-420ab49781c6-kube-api-access-xhkw2\") pod \"metallb-operator-webhook-server-5c94cffdb4-lj8nf\" (UID: \"bb4359a2-e2a1-4e37-b7df-420ab49781c6\") " pod="metallb-system/metallb-operator-webhook-server-5c94cffdb4-lj8nf" Oct 12 07:45:02 crc kubenswrapper[4599]: I1012 07:45:02.141801 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bb4359a2-e2a1-4e37-b7df-420ab49781c6-apiservice-cert\") pod \"metallb-operator-webhook-server-5c94cffdb4-lj8nf\" (UID: \"bb4359a2-e2a1-4e37-b7df-420ab49781c6\") " pod="metallb-system/metallb-operator-webhook-server-5c94cffdb4-lj8nf" Oct 12 07:45:02 crc kubenswrapper[4599]: I1012 07:45:02.141862 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bb4359a2-e2a1-4e37-b7df-420ab49781c6-webhook-cert\") pod \"metallb-operator-webhook-server-5c94cffdb4-lj8nf\" (UID: \"bb4359a2-e2a1-4e37-b7df-420ab49781c6\") " pod="metallb-system/metallb-operator-webhook-server-5c94cffdb4-lj8nf" Oct 12 07:45:02 crc kubenswrapper[4599]: I1012 07:45:02.141900 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhkw2\" (UniqueName: \"kubernetes.io/projected/bb4359a2-e2a1-4e37-b7df-420ab49781c6-kube-api-access-xhkw2\") pod \"metallb-operator-webhook-server-5c94cffdb4-lj8nf\" (UID: \"bb4359a2-e2a1-4e37-b7df-420ab49781c6\") " pod="metallb-system/metallb-operator-webhook-server-5c94cffdb4-lj8nf" Oct 12 07:45:02 crc kubenswrapper[4599]: I1012 07:45:02.147130 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bb4359a2-e2a1-4e37-b7df-420ab49781c6-apiservice-cert\") pod \"metallb-operator-webhook-server-5c94cffdb4-lj8nf\" (UID: \"bb4359a2-e2a1-4e37-b7df-420ab49781c6\") " pod="metallb-system/metallb-operator-webhook-server-5c94cffdb4-lj8nf" Oct 12 07:45:02 crc kubenswrapper[4599]: I1012 07:45:02.154881 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bb4359a2-e2a1-4e37-b7df-420ab49781c6-webhook-cert\") pod \"metallb-operator-webhook-server-5c94cffdb4-lj8nf\" (UID: \"bb4359a2-e2a1-4e37-b7df-420ab49781c6\") " pod="metallb-system/metallb-operator-webhook-server-5c94cffdb4-lj8nf" Oct 12 07:45:02 crc kubenswrapper[4599]: I1012 07:45:02.158680 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhkw2\" (UniqueName: \"kubernetes.io/projected/bb4359a2-e2a1-4e37-b7df-420ab49781c6-kube-api-access-xhkw2\") pod \"metallb-operator-webhook-server-5c94cffdb4-lj8nf\" (UID: \"bb4359a2-e2a1-4e37-b7df-420ab49781c6\") " 
pod="metallb-system/metallb-operator-webhook-server-5c94cffdb4-lj8nf" Oct 12 07:45:02 crc kubenswrapper[4599]: I1012 07:45:02.231914 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-75d566c47b-2dhk8"] Oct 12 07:45:02 crc kubenswrapper[4599]: W1012 07:45:02.236552 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14e65bd0_37ae_438c_9d25_b2d4b70556e7.slice/crio-7dc0e50a1227e45702b38ce7a2f67c097d751c5ce37ffc83eac5ed6292b853f2 WatchSource:0}: Error finding container 7dc0e50a1227e45702b38ce7a2f67c097d751c5ce37ffc83eac5ed6292b853f2: Status 404 returned error can't find the container with id 7dc0e50a1227e45702b38ce7a2f67c097d751c5ce37ffc83eac5ed6292b853f2 Oct 12 07:45:02 crc kubenswrapper[4599]: I1012 07:45:02.291747 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5c94cffdb4-lj8nf" Oct 12 07:45:02 crc kubenswrapper[4599]: I1012 07:45:02.654569 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5c94cffdb4-lj8nf"] Oct 12 07:45:02 crc kubenswrapper[4599]: W1012 07:45:02.657894 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb4359a2_e2a1_4e37_b7df_420ab49781c6.slice/crio-2a5ec9ca56aab9bcbd7c35a72ef5caa680a7b9a90105d31c37139fce3514bf5f WatchSource:0}: Error finding container 2a5ec9ca56aab9bcbd7c35a72ef5caa680a7b9a90105d31c37139fce3514bf5f: Status 404 returned error can't find the container with id 2a5ec9ca56aab9bcbd7c35a72ef5caa680a7b9a90105d31c37139fce3514bf5f Oct 12 07:45:02 crc kubenswrapper[4599]: I1012 07:45:02.916679 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-75d566c47b-2dhk8" 
event={"ID":"14e65bd0-37ae-438c-9d25-b2d4b70556e7","Type":"ContainerStarted","Data":"7dc0e50a1227e45702b38ce7a2f67c097d751c5ce37ffc83eac5ed6292b853f2"} Oct 12 07:45:02 crc kubenswrapper[4599]: I1012 07:45:02.918218 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5c94cffdb4-lj8nf" event={"ID":"bb4359a2-e2a1-4e37-b7df-420ab49781c6","Type":"ContainerStarted","Data":"2a5ec9ca56aab9bcbd7c35a72ef5caa680a7b9a90105d31c37139fce3514bf5f"} Oct 12 07:45:03 crc kubenswrapper[4599]: I1012 07:45:03.116966 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn" Oct 12 07:45:03 crc kubenswrapper[4599]: I1012 07:45:03.255252 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlzdz\" (UniqueName: \"kubernetes.io/projected/b6dd7df5-77c0-4ffb-9d14-425465bb9ab3-kube-api-access-rlzdz\") pod \"b6dd7df5-77c0-4ffb-9d14-425465bb9ab3\" (UID: \"b6dd7df5-77c0-4ffb-9d14-425465bb9ab3\") " Oct 12 07:45:03 crc kubenswrapper[4599]: I1012 07:45:03.255319 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6dd7df5-77c0-4ffb-9d14-425465bb9ab3-config-volume\") pod \"b6dd7df5-77c0-4ffb-9d14-425465bb9ab3\" (UID: \"b6dd7df5-77c0-4ffb-9d14-425465bb9ab3\") " Oct 12 07:45:03 crc kubenswrapper[4599]: I1012 07:45:03.255424 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6dd7df5-77c0-4ffb-9d14-425465bb9ab3-secret-volume\") pod \"b6dd7df5-77c0-4ffb-9d14-425465bb9ab3\" (UID: \"b6dd7df5-77c0-4ffb-9d14-425465bb9ab3\") " Oct 12 07:45:03 crc kubenswrapper[4599]: I1012 07:45:03.256436 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6dd7df5-77c0-4ffb-9d14-425465bb9ab3-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "b6dd7df5-77c0-4ffb-9d14-425465bb9ab3" (UID: "b6dd7df5-77c0-4ffb-9d14-425465bb9ab3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:45:03 crc kubenswrapper[4599]: I1012 07:45:03.259585 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6dd7df5-77c0-4ffb-9d14-425465bb9ab3-kube-api-access-rlzdz" (OuterVolumeSpecName: "kube-api-access-rlzdz") pod "b6dd7df5-77c0-4ffb-9d14-425465bb9ab3" (UID: "b6dd7df5-77c0-4ffb-9d14-425465bb9ab3"). InnerVolumeSpecName "kube-api-access-rlzdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:45:03 crc kubenswrapper[4599]: I1012 07:45:03.260163 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6dd7df5-77c0-4ffb-9d14-425465bb9ab3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b6dd7df5-77c0-4ffb-9d14-425465bb9ab3" (UID: "b6dd7df5-77c0-4ffb-9d14-425465bb9ab3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:45:03 crc kubenswrapper[4599]: I1012 07:45:03.357116 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlzdz\" (UniqueName: \"kubernetes.io/projected/b6dd7df5-77c0-4ffb-9d14-425465bb9ab3-kube-api-access-rlzdz\") on node \"crc\" DevicePath \"\"" Oct 12 07:45:03 crc kubenswrapper[4599]: I1012 07:45:03.357215 4599 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6dd7df5-77c0-4ffb-9d14-425465bb9ab3-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 07:45:03 crc kubenswrapper[4599]: I1012 07:45:03.357280 4599 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6dd7df5-77c0-4ffb-9d14-425465bb9ab3-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 12 07:45:03 crc kubenswrapper[4599]: I1012 07:45:03.935463 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn" event={"ID":"b6dd7df5-77c0-4ffb-9d14-425465bb9ab3","Type":"ContainerDied","Data":"4c9ebcce603996059a2cd9cf9e3d47b7705d9410fb632f75638f89b270c0798c"} Oct 12 07:45:03 crc kubenswrapper[4599]: I1012 07:45:03.935772 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c9ebcce603996059a2cd9cf9e3d47b7705d9410fb632f75638f89b270c0798c" Oct 12 07:45:03 crc kubenswrapper[4599]: I1012 07:45:03.935705 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn" Oct 12 07:45:05 crc kubenswrapper[4599]: I1012 07:45:05.963125 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-75d566c47b-2dhk8" event={"ID":"14e65bd0-37ae-438c-9d25-b2d4b70556e7","Type":"ContainerStarted","Data":"8315bbbc5e80ba9ff7f1017e79b3705421d06910cb26df108ae6cd14b07fdcf3"} Oct 12 07:45:05 crc kubenswrapper[4599]: I1012 07:45:05.964510 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-75d566c47b-2dhk8" Oct 12 07:45:05 crc kubenswrapper[4599]: I1012 07:45:05.986925 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-75d566c47b-2dhk8" podStartSLOduration=2.066836556 podStartE2EDuration="4.986898652s" podCreationTimestamp="2025-10-12 07:45:01 +0000 UTC" firstStartedPulling="2025-10-12 07:45:02.240528538 +0000 UTC m=+599.029724040" lastFinishedPulling="2025-10-12 07:45:05.160590633 +0000 UTC m=+601.949786136" observedRunningTime="2025-10-12 07:45:05.978365211 +0000 UTC m=+602.767560713" watchObservedRunningTime="2025-10-12 07:45:05.986898652 +0000 UTC m=+602.776094154" Oct 12 07:45:06 crc kubenswrapper[4599]: I1012 07:45:06.973432 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5c94cffdb4-lj8nf" event={"ID":"bb4359a2-e2a1-4e37-b7df-420ab49781c6","Type":"ContainerStarted","Data":"2695d850832fcfee793e63d92dbdf97e29d75faa414b69ec0cbd9f31b1496dce"} Oct 12 07:45:06 crc kubenswrapper[4599]: I1012 07:45:06.973741 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5c94cffdb4-lj8nf" Oct 12 07:45:06 crc kubenswrapper[4599]: I1012 07:45:06.995466 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/metallb-operator-webhook-server-5c94cffdb4-lj8nf" podStartSLOduration=2.399026191 podStartE2EDuration="5.995449023s" podCreationTimestamp="2025-10-12 07:45:01 +0000 UTC" firstStartedPulling="2025-10-12 07:45:02.659854407 +0000 UTC m=+599.449049909" lastFinishedPulling="2025-10-12 07:45:06.256277238 +0000 UTC m=+603.045472741" observedRunningTime="2025-10-12 07:45:06.989563398 +0000 UTC m=+603.778758901" watchObservedRunningTime="2025-10-12 07:45:06.995449023 +0000 UTC m=+603.784644526" Oct 12 07:45:09 crc kubenswrapper[4599]: E1012 07:45:09.271617 4599 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6dd7df5_77c0_4ffb_9d14_425465bb9ab3.slice/crio-4c9ebcce603996059a2cd9cf9e3d47b7705d9410fb632f75638f89b270c0798c\": RecentStats: unable to find data in memory cache]" Oct 12 07:45:19 crc kubenswrapper[4599]: E1012 07:45:19.402543 4599 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6dd7df5_77c0_4ffb_9d14_425465bb9ab3.slice/crio-4c9ebcce603996059a2cd9cf9e3d47b7705d9410fb632f75638f89b270c0798c\": RecentStats: unable to find data in memory cache]" Oct 12 07:45:22 crc kubenswrapper[4599]: I1012 07:45:22.297535 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5c94cffdb4-lj8nf" Oct 12 07:45:29 crc kubenswrapper[4599]: E1012 07:45:29.520575 4599 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6dd7df5_77c0_4ffb_9d14_425465bb9ab3.slice/crio-4c9ebcce603996059a2cd9cf9e3d47b7705d9410fb632f75638f89b270c0798c\": RecentStats: unable to find data in memory cache]" Oct 12 07:45:39 crc kubenswrapper[4599]: E1012 
07:45:39.632403 4599 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6dd7df5_77c0_4ffb_9d14_425465bb9ab3.slice/crio-4c9ebcce603996059a2cd9cf9e3d47b7705d9410fb632f75638f89b270c0798c\": RecentStats: unable to find data in memory cache]" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.027540 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-75d566c47b-2dhk8" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.578639 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-76vb2"] Oct 12 07:45:42 crc kubenswrapper[4599]: E1012 07:45:42.579225 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6dd7df5-77c0-4ffb-9d14-425465bb9ab3" containerName="collect-profiles" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.579240 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6dd7df5-77c0-4ffb-9d14-425465bb9ab3" containerName="collect-profiles" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.579380 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6dd7df5-77c0-4ffb-9d14-425465bb9ab3" containerName="collect-profiles" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.581218 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.581790 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-zcx46"] Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.582657 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcx46" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.583927 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.584312 4599 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.584431 4599 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-ljmb4" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.586217 4599 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.588841 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-zcx46"] Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.656429 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-g4nt4"] Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.657439 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-g4nt4" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.659264 4599 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.659497 4599 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.659534 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.662941 4599 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-vjq4s" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.667424 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-v2d4g"] Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.668420 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-v2d4g" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.669563 4599 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.681590 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-v2d4g"] Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.702818 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e2411c1e-4e34-4ca5-aa25-0b317652dd35-frr-sockets\") pod \"frr-k8s-76vb2\" (UID: \"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.702859 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12d002b3-67b4-4405-ab2f-930346bfc610-metrics-certs\") pod \"speaker-g4nt4\" (UID: \"12d002b3-67b4-4405-ab2f-930346bfc610\") " pod="metallb-system/speaker-g4nt4" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.702889 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fvfn\" (UniqueName: \"kubernetes.io/projected/6f5b7478-9c00-4302-b356-cae717338202-kube-api-access-9fvfn\") pod \"frr-k8s-webhook-server-64bf5d555-zcx46\" (UID: \"6f5b7478-9c00-4302-b356-cae717338202\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcx46" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.703068 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2411c1e-4e34-4ca5-aa25-0b317652dd35-metrics-certs\") pod \"frr-k8s-76vb2\" (UID: \"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 
12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.703229 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/12d002b3-67b4-4405-ab2f-930346bfc610-memberlist\") pod \"speaker-g4nt4\" (UID: \"12d002b3-67b4-4405-ab2f-930346bfc610\") " pod="metallb-system/speaker-g4nt4" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.703277 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e2411c1e-4e34-4ca5-aa25-0b317652dd35-metrics\") pod \"frr-k8s-76vb2\" (UID: \"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.703296 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/12d002b3-67b4-4405-ab2f-930346bfc610-metallb-excludel2\") pod \"speaker-g4nt4\" (UID: \"12d002b3-67b4-4405-ab2f-930346bfc610\") " pod="metallb-system/speaker-g4nt4" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.703408 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f5b7478-9c00-4302-b356-cae717338202-cert\") pod \"frr-k8s-webhook-server-64bf5d555-zcx46\" (UID: \"6f5b7478-9c00-4302-b356-cae717338202\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcx46" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.703436 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qrjj\" (UniqueName: \"kubernetes.io/projected/12d002b3-67b4-4405-ab2f-930346bfc610-kube-api-access-9qrjj\") pod \"speaker-g4nt4\" (UID: \"12d002b3-67b4-4405-ab2f-930346bfc610\") " pod="metallb-system/speaker-g4nt4" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.703455 
4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e2411c1e-4e34-4ca5-aa25-0b317652dd35-frr-startup\") pod \"frr-k8s-76vb2\" (UID: \"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.703478 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e2411c1e-4e34-4ca5-aa25-0b317652dd35-reloader\") pod \"frr-k8s-76vb2\" (UID: \"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.703495 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzlq4\" (UniqueName: \"kubernetes.io/projected/a8d05e19-feeb-41e3-ab30-f55af42472ca-kube-api-access-hzlq4\") pod \"controller-68d546b9d8-v2d4g\" (UID: \"a8d05e19-feeb-41e3-ab30-f55af42472ca\") " pod="metallb-system/controller-68d546b9d8-v2d4g" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.703511 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8d05e19-feeb-41e3-ab30-f55af42472ca-cert\") pod \"controller-68d546b9d8-v2d4g\" (UID: \"a8d05e19-feeb-41e3-ab30-f55af42472ca\") " pod="metallb-system/controller-68d546b9d8-v2d4g" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.703529 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8d05e19-feeb-41e3-ab30-f55af42472ca-metrics-certs\") pod \"controller-68d546b9d8-v2d4g\" (UID: \"a8d05e19-feeb-41e3-ab30-f55af42472ca\") " pod="metallb-system/controller-68d546b9d8-v2d4g" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.703611 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e2411c1e-4e34-4ca5-aa25-0b317652dd35-frr-conf\") pod \"frr-k8s-76vb2\" (UID: \"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.703798 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6fjl\" (UniqueName: \"kubernetes.io/projected/e2411c1e-4e34-4ca5-aa25-0b317652dd35-kube-api-access-f6fjl\") pod \"frr-k8s-76vb2\" (UID: \"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.805183 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2411c1e-4e34-4ca5-aa25-0b317652dd35-metrics-certs\") pod \"frr-k8s-76vb2\" (UID: \"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.805250 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/12d002b3-67b4-4405-ab2f-930346bfc610-memberlist\") pod \"speaker-g4nt4\" (UID: \"12d002b3-67b4-4405-ab2f-930346bfc610\") " pod="metallb-system/speaker-g4nt4" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.805269 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e2411c1e-4e34-4ca5-aa25-0b317652dd35-metrics\") pod \"frr-k8s-76vb2\" (UID: \"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.805287 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/12d002b3-67b4-4405-ab2f-930346bfc610-metallb-excludel2\") 
pod \"speaker-g4nt4\" (UID: \"12d002b3-67b4-4405-ab2f-930346bfc610\") " pod="metallb-system/speaker-g4nt4" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.805313 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f5b7478-9c00-4302-b356-cae717338202-cert\") pod \"frr-k8s-webhook-server-64bf5d555-zcx46\" (UID: \"6f5b7478-9c00-4302-b356-cae717338202\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcx46" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.805345 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qrjj\" (UniqueName: \"kubernetes.io/projected/12d002b3-67b4-4405-ab2f-930346bfc610-kube-api-access-9qrjj\") pod \"speaker-g4nt4\" (UID: \"12d002b3-67b4-4405-ab2f-930346bfc610\") " pod="metallb-system/speaker-g4nt4" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.805365 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e2411c1e-4e34-4ca5-aa25-0b317652dd35-frr-startup\") pod \"frr-k8s-76vb2\" (UID: \"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.805384 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e2411c1e-4e34-4ca5-aa25-0b317652dd35-reloader\") pod \"frr-k8s-76vb2\" (UID: \"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.805402 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzlq4\" (UniqueName: \"kubernetes.io/projected/a8d05e19-feeb-41e3-ab30-f55af42472ca-kube-api-access-hzlq4\") pod \"controller-68d546b9d8-v2d4g\" (UID: \"a8d05e19-feeb-41e3-ab30-f55af42472ca\") " pod="metallb-system/controller-68d546b9d8-v2d4g" Oct 12 
07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.805420 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8d05e19-feeb-41e3-ab30-f55af42472ca-cert\") pod \"controller-68d546b9d8-v2d4g\" (UID: \"a8d05e19-feeb-41e3-ab30-f55af42472ca\") " pod="metallb-system/controller-68d546b9d8-v2d4g" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.805438 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8d05e19-feeb-41e3-ab30-f55af42472ca-metrics-certs\") pod \"controller-68d546b9d8-v2d4g\" (UID: \"a8d05e19-feeb-41e3-ab30-f55af42472ca\") " pod="metallb-system/controller-68d546b9d8-v2d4g" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.805460 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e2411c1e-4e34-4ca5-aa25-0b317652dd35-frr-conf\") pod \"frr-k8s-76vb2\" (UID: \"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.805475 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6fjl\" (UniqueName: \"kubernetes.io/projected/e2411c1e-4e34-4ca5-aa25-0b317652dd35-kube-api-access-f6fjl\") pod \"frr-k8s-76vb2\" (UID: \"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.805522 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e2411c1e-4e34-4ca5-aa25-0b317652dd35-frr-sockets\") pod \"frr-k8s-76vb2\" (UID: \"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.805538 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/12d002b3-67b4-4405-ab2f-930346bfc610-metrics-certs\") pod \"speaker-g4nt4\" (UID: \"12d002b3-67b4-4405-ab2f-930346bfc610\") " pod="metallb-system/speaker-g4nt4" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.805561 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fvfn\" (UniqueName: \"kubernetes.io/projected/6f5b7478-9c00-4302-b356-cae717338202-kube-api-access-9fvfn\") pod \"frr-k8s-webhook-server-64bf5d555-zcx46\" (UID: \"6f5b7478-9c00-4302-b356-cae717338202\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcx46" Oct 12 07:45:42 crc kubenswrapper[4599]: E1012 07:45:42.805673 4599 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 12 07:45:42 crc kubenswrapper[4599]: E1012 07:45:42.805761 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8d05e19-feeb-41e3-ab30-f55af42472ca-metrics-certs podName:a8d05e19-feeb-41e3-ab30-f55af42472ca nodeName:}" failed. No retries permitted until 2025-10-12 07:45:43.305738789 +0000 UTC m=+640.094934291 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8d05e19-feeb-41e3-ab30-f55af42472ca-metrics-certs") pod "controller-68d546b9d8-v2d4g" (UID: "a8d05e19-feeb-41e3-ab30-f55af42472ca") : secret "controller-certs-secret" not found Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.806023 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e2411c1e-4e34-4ca5-aa25-0b317652dd35-metrics\") pod \"frr-k8s-76vb2\" (UID: \"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.806078 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e2411c1e-4e34-4ca5-aa25-0b317652dd35-reloader\") pod \"frr-k8s-76vb2\" (UID: \"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: E1012 07:45:42.806113 4599 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 12 07:45:42 crc kubenswrapper[4599]: E1012 07:45:42.806184 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12d002b3-67b4-4405-ab2f-930346bfc610-memberlist podName:12d002b3-67b4-4405-ab2f-930346bfc610 nodeName:}" failed. No retries permitted until 2025-10-12 07:45:43.306160084 +0000 UTC m=+640.095355587 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/12d002b3-67b4-4405-ab2f-930346bfc610-memberlist") pod "speaker-g4nt4" (UID: "12d002b3-67b4-4405-ab2f-930346bfc610") : secret "metallb-memberlist" not found Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.806234 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e2411c1e-4e34-4ca5-aa25-0b317652dd35-frr-conf\") pod \"frr-k8s-76vb2\" (UID: \"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.806286 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e2411c1e-4e34-4ca5-aa25-0b317652dd35-frr-sockets\") pod \"frr-k8s-76vb2\" (UID: \"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.806452 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/12d002b3-67b4-4405-ab2f-930346bfc610-metallb-excludel2\") pod \"speaker-g4nt4\" (UID: \"12d002b3-67b4-4405-ab2f-930346bfc610\") " pod="metallb-system/speaker-g4nt4" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.807268 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e2411c1e-4e34-4ca5-aa25-0b317652dd35-frr-startup\") pod \"frr-k8s-76vb2\" (UID: \"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.808424 4599 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.811220 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e2411c1e-4e34-4ca5-aa25-0b317652dd35-metrics-certs\") pod \"frr-k8s-76vb2\" (UID: \"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.811741 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f5b7478-9c00-4302-b356-cae717338202-cert\") pod \"frr-k8s-webhook-server-64bf5d555-zcx46\" (UID: \"6f5b7478-9c00-4302-b356-cae717338202\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcx46" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.812817 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12d002b3-67b4-4405-ab2f-930346bfc610-metrics-certs\") pod \"speaker-g4nt4\" (UID: \"12d002b3-67b4-4405-ab2f-930346bfc610\") " pod="metallb-system/speaker-g4nt4" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.819135 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8d05e19-feeb-41e3-ab30-f55af42472ca-cert\") pod \"controller-68d546b9d8-v2d4g\" (UID: \"a8d05e19-feeb-41e3-ab30-f55af42472ca\") " pod="metallb-system/controller-68d546b9d8-v2d4g" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.819687 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fvfn\" (UniqueName: \"kubernetes.io/projected/6f5b7478-9c00-4302-b356-cae717338202-kube-api-access-9fvfn\") pod \"frr-k8s-webhook-server-64bf5d555-zcx46\" (UID: \"6f5b7478-9c00-4302-b356-cae717338202\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcx46" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.822167 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6fjl\" (UniqueName: \"kubernetes.io/projected/e2411c1e-4e34-4ca5-aa25-0b317652dd35-kube-api-access-f6fjl\") pod \"frr-k8s-76vb2\" (UID: 
\"e2411c1e-4e34-4ca5-aa25-0b317652dd35\") " pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.824660 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzlq4\" (UniqueName: \"kubernetes.io/projected/a8d05e19-feeb-41e3-ab30-f55af42472ca-kube-api-access-hzlq4\") pod \"controller-68d546b9d8-v2d4g\" (UID: \"a8d05e19-feeb-41e3-ab30-f55af42472ca\") " pod="metallb-system/controller-68d546b9d8-v2d4g" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.826685 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qrjj\" (UniqueName: \"kubernetes.io/projected/12d002b3-67b4-4405-ab2f-930346bfc610-kube-api-access-9qrjj\") pod \"speaker-g4nt4\" (UID: \"12d002b3-67b4-4405-ab2f-930346bfc610\") " pod="metallb-system/speaker-g4nt4" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.897152 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:42 crc kubenswrapper[4599]: I1012 07:45:42.906744 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcx46" Oct 12 07:45:43 crc kubenswrapper[4599]: I1012 07:45:43.166627 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-76vb2" event={"ID":"e2411c1e-4e34-4ca5-aa25-0b317652dd35","Type":"ContainerStarted","Data":"efa0960406a4d435d04fe63f4e96c7b2e0435800644d70a9f614ad191f874d58"} Oct 12 07:45:43 crc kubenswrapper[4599]: I1012 07:45:43.268042 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-zcx46"] Oct 12 07:45:43 crc kubenswrapper[4599]: W1012 07:45:43.273071 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f5b7478_9c00_4302_b356_cae717338202.slice/crio-fd53354479627e8a608dc7882b5c97746a6d056b79c651bc6c66c354b468bbc0 WatchSource:0}: Error finding container fd53354479627e8a608dc7882b5c97746a6d056b79c651bc6c66c354b468bbc0: Status 404 returned error can't find the container with id fd53354479627e8a608dc7882b5c97746a6d056b79c651bc6c66c354b468bbc0 Oct 12 07:45:43 crc kubenswrapper[4599]: I1012 07:45:43.311574 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/12d002b3-67b4-4405-ab2f-930346bfc610-memberlist\") pod \"speaker-g4nt4\" (UID: \"12d002b3-67b4-4405-ab2f-930346bfc610\") " pod="metallb-system/speaker-g4nt4" Oct 12 07:45:43 crc kubenswrapper[4599]: I1012 07:45:43.311634 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8d05e19-feeb-41e3-ab30-f55af42472ca-metrics-certs\") pod \"controller-68d546b9d8-v2d4g\" (UID: \"a8d05e19-feeb-41e3-ab30-f55af42472ca\") " pod="metallb-system/controller-68d546b9d8-v2d4g" Oct 12 07:45:43 crc kubenswrapper[4599]: E1012 07:45:43.311827 4599 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret 
"metallb-memberlist" not found Oct 12 07:45:43 crc kubenswrapper[4599]: E1012 07:45:43.311942 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12d002b3-67b4-4405-ab2f-930346bfc610-memberlist podName:12d002b3-67b4-4405-ab2f-930346bfc610 nodeName:}" failed. No retries permitted until 2025-10-12 07:45:44.311917536 +0000 UTC m=+641.101113038 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/12d002b3-67b4-4405-ab2f-930346bfc610-memberlist") pod "speaker-g4nt4" (UID: "12d002b3-67b4-4405-ab2f-930346bfc610") : secret "metallb-memberlist" not found Oct 12 07:45:43 crc kubenswrapper[4599]: I1012 07:45:43.316876 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8d05e19-feeb-41e3-ab30-f55af42472ca-metrics-certs\") pod \"controller-68d546b9d8-v2d4g\" (UID: \"a8d05e19-feeb-41e3-ab30-f55af42472ca\") " pod="metallb-system/controller-68d546b9d8-v2d4g" Oct 12 07:45:43 crc kubenswrapper[4599]: I1012 07:45:43.578769 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-v2d4g" Oct 12 07:45:43 crc kubenswrapper[4599]: I1012 07:45:43.805345 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-v2d4g"] Oct 12 07:45:43 crc kubenswrapper[4599]: W1012 07:45:43.808552 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8d05e19_feeb_41e3_ab30_f55af42472ca.slice/crio-d6af43319a50032cdbfb54f4457be4dfe12c4702538ecd5b1007271aeb776d6a WatchSource:0}: Error finding container d6af43319a50032cdbfb54f4457be4dfe12c4702538ecd5b1007271aeb776d6a: Status 404 returned error can't find the container with id d6af43319a50032cdbfb54f4457be4dfe12c4702538ecd5b1007271aeb776d6a Oct 12 07:45:44 crc kubenswrapper[4599]: I1012 07:45:44.172595 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-v2d4g" event={"ID":"a8d05e19-feeb-41e3-ab30-f55af42472ca","Type":"ContainerStarted","Data":"a76ad325c2b8c373db89502663bc947930be20486ee001ee136e42a3aa20ab95"} Oct 12 07:45:44 crc kubenswrapper[4599]: I1012 07:45:44.172647 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-v2d4g" event={"ID":"a8d05e19-feeb-41e3-ab30-f55af42472ca","Type":"ContainerStarted","Data":"07a6c3cdee05ec866e0dd08d143b6ed22e8938adcedc46ecb2ac613ba1e62daf"} Oct 12 07:45:44 crc kubenswrapper[4599]: I1012 07:45:44.172658 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-v2d4g" event={"ID":"a8d05e19-feeb-41e3-ab30-f55af42472ca","Type":"ContainerStarted","Data":"d6af43319a50032cdbfb54f4457be4dfe12c4702538ecd5b1007271aeb776d6a"} Oct 12 07:45:44 crc kubenswrapper[4599]: I1012 07:45:44.172694 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-v2d4g" Oct 12 07:45:44 crc kubenswrapper[4599]: I1012 07:45:44.174870 4599 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcx46" event={"ID":"6f5b7478-9c00-4302-b356-cae717338202","Type":"ContainerStarted","Data":"fd53354479627e8a608dc7882b5c97746a6d056b79c651bc6c66c354b468bbc0"} Oct 12 07:45:44 crc kubenswrapper[4599]: I1012 07:45:44.187109 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-v2d4g" podStartSLOduration=2.187089287 podStartE2EDuration="2.187089287s" podCreationTimestamp="2025-10-12 07:45:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:45:44.183033686 +0000 UTC m=+640.972229188" watchObservedRunningTime="2025-10-12 07:45:44.187089287 +0000 UTC m=+640.976284789" Oct 12 07:45:44 crc kubenswrapper[4599]: I1012 07:45:44.323845 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/12d002b3-67b4-4405-ab2f-930346bfc610-memberlist\") pod \"speaker-g4nt4\" (UID: \"12d002b3-67b4-4405-ab2f-930346bfc610\") " pod="metallb-system/speaker-g4nt4" Oct 12 07:45:44 crc kubenswrapper[4599]: I1012 07:45:44.328738 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/12d002b3-67b4-4405-ab2f-930346bfc610-memberlist\") pod \"speaker-g4nt4\" (UID: \"12d002b3-67b4-4405-ab2f-930346bfc610\") " pod="metallb-system/speaker-g4nt4" Oct 12 07:45:44 crc kubenswrapper[4599]: I1012 07:45:44.468865 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-g4nt4" Oct 12 07:45:44 crc kubenswrapper[4599]: W1012 07:45:44.490293 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12d002b3_67b4_4405_ab2f_930346bfc610.slice/crio-8e22d310fbb8a4c122f8919fb418e668919c4fb5f1ffb1366bc49324dabb8b3b WatchSource:0}: Error finding container 8e22d310fbb8a4c122f8919fb418e668919c4fb5f1ffb1366bc49324dabb8b3b: Status 404 returned error can't find the container with id 8e22d310fbb8a4c122f8919fb418e668919c4fb5f1ffb1366bc49324dabb8b3b Oct 12 07:45:45 crc kubenswrapper[4599]: I1012 07:45:45.182577 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-g4nt4" event={"ID":"12d002b3-67b4-4405-ab2f-930346bfc610","Type":"ContainerStarted","Data":"84c84260cd5d7c85aec6acebc97cf3d771cc96ab60534e2db6c8c3312c7bef39"} Oct 12 07:45:45 crc kubenswrapper[4599]: I1012 07:45:45.182915 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-g4nt4" event={"ID":"12d002b3-67b4-4405-ab2f-930346bfc610","Type":"ContainerStarted","Data":"9be31b9b7d7e130b45f145efeca678196ba579ee882b688afb9c19f3ec1ba032"} Oct 12 07:45:45 crc kubenswrapper[4599]: I1012 07:45:45.182931 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-g4nt4" event={"ID":"12d002b3-67b4-4405-ab2f-930346bfc610","Type":"ContainerStarted","Data":"8e22d310fbb8a4c122f8919fb418e668919c4fb5f1ffb1366bc49324dabb8b3b"} Oct 12 07:45:45 crc kubenswrapper[4599]: I1012 07:45:45.183120 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-g4nt4" Oct 12 07:45:45 crc kubenswrapper[4599]: I1012 07:45:45.200133 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-g4nt4" podStartSLOduration=3.20011395 podStartE2EDuration="3.20011395s" podCreationTimestamp="2025-10-12 07:45:42 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:45:45.198853062 +0000 UTC m=+641.988048564" watchObservedRunningTime="2025-10-12 07:45:45.20011395 +0000 UTC m=+641.989309452" Oct 12 07:45:49 crc kubenswrapper[4599]: I1012 07:45:49.207322 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcx46" event={"ID":"6f5b7478-9c00-4302-b356-cae717338202","Type":"ContainerStarted","Data":"2b9066e05317ea1b9f5306afa9c754f138327c2e15643a885384a83eaac08faa"} Oct 12 07:45:49 crc kubenswrapper[4599]: I1012 07:45:49.207666 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcx46" Oct 12 07:45:49 crc kubenswrapper[4599]: I1012 07:45:49.209391 4599 generic.go:334] "Generic (PLEG): container finished" podID="e2411c1e-4e34-4ca5-aa25-0b317652dd35" containerID="37d6b953bcb6dea6e08cccda35d730e3a7c00d51797121daeee578e21db3dd18" exitCode=0 Oct 12 07:45:49 crc kubenswrapper[4599]: I1012 07:45:49.209448 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-76vb2" event={"ID":"e2411c1e-4e34-4ca5-aa25-0b317652dd35","Type":"ContainerDied","Data":"37d6b953bcb6dea6e08cccda35d730e3a7c00d51797121daeee578e21db3dd18"} Oct 12 07:45:49 crc kubenswrapper[4599]: I1012 07:45:49.242728 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcx46" podStartSLOduration=1.612768934 podStartE2EDuration="7.242702451s" podCreationTimestamp="2025-10-12 07:45:42 +0000 UTC" firstStartedPulling="2025-10-12 07:45:43.275374599 +0000 UTC m=+640.064570101" lastFinishedPulling="2025-10-12 07:45:48.905308116 +0000 UTC m=+645.694503618" observedRunningTime="2025-10-12 07:45:49.226174327 +0000 UTC m=+646.015369828" watchObservedRunningTime="2025-10-12 07:45:49.242702451 +0000 UTC m=+646.031898113" Oct 12 07:45:49 
crc kubenswrapper[4599]: E1012 07:45:49.750736 4599 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6dd7df5_77c0_4ffb_9d14_425465bb9ab3.slice/crio-4c9ebcce603996059a2cd9cf9e3d47b7705d9410fb632f75638f89b270c0798c\": RecentStats: unable to find data in memory cache]" Oct 12 07:45:50 crc kubenswrapper[4599]: I1012 07:45:50.218235 4599 generic.go:334] "Generic (PLEG): container finished" podID="e2411c1e-4e34-4ca5-aa25-0b317652dd35" containerID="8d90b9b264d5e4e983639891ca8e65de6aabb654aaa73b9293a95b820d5f2b57" exitCode=0 Oct 12 07:45:50 crc kubenswrapper[4599]: I1012 07:45:50.218358 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-76vb2" event={"ID":"e2411c1e-4e34-4ca5-aa25-0b317652dd35","Type":"ContainerDied","Data":"8d90b9b264d5e4e983639891ca8e65de6aabb654aaa73b9293a95b820d5f2b57"} Oct 12 07:45:51 crc kubenswrapper[4599]: I1012 07:45:51.225403 4599 generic.go:334] "Generic (PLEG): container finished" podID="e2411c1e-4e34-4ca5-aa25-0b317652dd35" containerID="0c02e488841af3692bbd99194cfbe5644d1f36e57ce1dc7784fadec37f729dfe" exitCode=0 Oct 12 07:45:51 crc kubenswrapper[4599]: I1012 07:45:51.225641 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-76vb2" event={"ID":"e2411c1e-4e34-4ca5-aa25-0b317652dd35","Type":"ContainerDied","Data":"0c02e488841af3692bbd99194cfbe5644d1f36e57ce1dc7784fadec37f729dfe"} Oct 12 07:45:52 crc kubenswrapper[4599]: I1012 07:45:52.237202 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-76vb2" event={"ID":"e2411c1e-4e34-4ca5-aa25-0b317652dd35","Type":"ContainerStarted","Data":"59a98117990006f124d89ad80382a01628a7aff7834a8a586e6eb6045087fff5"} Oct 12 07:45:52 crc kubenswrapper[4599]: I1012 07:45:52.237249 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-76vb2" 
event={"ID":"e2411c1e-4e34-4ca5-aa25-0b317652dd35","Type":"ContainerStarted","Data":"b10b388a7d6783dd84327fa25b2e36e9e9a54094ac948fdfa0d48b56825e2034"} Oct 12 07:45:52 crc kubenswrapper[4599]: I1012 07:45:52.237261 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-76vb2" event={"ID":"e2411c1e-4e34-4ca5-aa25-0b317652dd35","Type":"ContainerStarted","Data":"08c7641c4216bbdf9f5c0e8f32c1e203198deffecadde0597747846824094a2a"} Oct 12 07:45:52 crc kubenswrapper[4599]: I1012 07:45:52.237270 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-76vb2" event={"ID":"e2411c1e-4e34-4ca5-aa25-0b317652dd35","Type":"ContainerStarted","Data":"aeb12aa52139c9bf0f3c03ca97df23069e8eda19462ac7eef8a76233549f85d6"} Oct 12 07:45:52 crc kubenswrapper[4599]: I1012 07:45:52.237281 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-76vb2" event={"ID":"e2411c1e-4e34-4ca5-aa25-0b317652dd35","Type":"ContainerStarted","Data":"c46cc0c06e4fe813c35f75bd850ee998f8290791da476ac41c17bdf83755e716"} Oct 12 07:45:52 crc kubenswrapper[4599]: I1012 07:45:52.237288 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-76vb2" event={"ID":"e2411c1e-4e34-4ca5-aa25-0b317652dd35","Type":"ContainerStarted","Data":"4b8c76956d8bca75db66828c69a0ad6a58469be05fe5db624255ba659ccd7ea4"} Oct 12 07:45:52 crc kubenswrapper[4599]: I1012 07:45:52.237373 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:52 crc kubenswrapper[4599]: I1012 07:45:52.261691 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-76vb2" podStartSLOduration=4.372558879 podStartE2EDuration="10.261678221s" podCreationTimestamp="2025-10-12 07:45:42 +0000 UTC" firstStartedPulling="2025-10-12 07:45:43.014650772 +0000 UTC m=+639.803846274" lastFinishedPulling="2025-10-12 07:45:48.903770114 +0000 UTC m=+645.692965616" 
observedRunningTime="2025-10-12 07:45:52.257161397 +0000 UTC m=+649.046356900" watchObservedRunningTime="2025-10-12 07:45:52.261678221 +0000 UTC m=+649.050873723" Oct 12 07:45:52 crc kubenswrapper[4599]: I1012 07:45:52.897524 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:52 crc kubenswrapper[4599]: I1012 07:45:52.927483 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-76vb2" Oct 12 07:45:53 crc kubenswrapper[4599]: I1012 07:45:53.583398 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-v2d4g" Oct 12 07:45:54 crc kubenswrapper[4599]: I1012 07:45:54.472521 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-g4nt4" Oct 12 07:45:56 crc kubenswrapper[4599]: I1012 07:45:56.638293 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-ds62j"] Oct 12 07:45:56 crc kubenswrapper[4599]: I1012 07:45:56.639084 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ds62j" Oct 12 07:45:56 crc kubenswrapper[4599]: I1012 07:45:56.641232 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 12 07:45:56 crc kubenswrapper[4599]: I1012 07:45:56.641232 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 12 07:45:56 crc kubenswrapper[4599]: I1012 07:45:56.641440 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-tb6zl" Oct 12 07:45:56 crc kubenswrapper[4599]: I1012 07:45:56.662157 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ds62j"] Oct 12 07:45:56 crc kubenswrapper[4599]: I1012 07:45:56.700319 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x984s\" (UniqueName: \"kubernetes.io/projected/2167f650-c160-4ede-ae67-5c8fd1f86b25-kube-api-access-x984s\") pod \"openstack-operator-index-ds62j\" (UID: \"2167f650-c160-4ede-ae67-5c8fd1f86b25\") " pod="openstack-operators/openstack-operator-index-ds62j" Oct 12 07:45:56 crc kubenswrapper[4599]: I1012 07:45:56.801395 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x984s\" (UniqueName: \"kubernetes.io/projected/2167f650-c160-4ede-ae67-5c8fd1f86b25-kube-api-access-x984s\") pod \"openstack-operator-index-ds62j\" (UID: \"2167f650-c160-4ede-ae67-5c8fd1f86b25\") " pod="openstack-operators/openstack-operator-index-ds62j" Oct 12 07:45:56 crc kubenswrapper[4599]: I1012 07:45:56.817645 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x984s\" (UniqueName: \"kubernetes.io/projected/2167f650-c160-4ede-ae67-5c8fd1f86b25-kube-api-access-x984s\") pod \"openstack-operator-index-ds62j\" (UID: 
\"2167f650-c160-4ede-ae67-5c8fd1f86b25\") " pod="openstack-operators/openstack-operator-index-ds62j" Oct 12 07:45:56 crc kubenswrapper[4599]: I1012 07:45:56.955841 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ds62j" Oct 12 07:45:57 crc kubenswrapper[4599]: I1012 07:45:57.315781 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ds62j"] Oct 12 07:45:57 crc kubenswrapper[4599]: W1012 07:45:57.321170 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2167f650_c160_4ede_ae67_5c8fd1f86b25.slice/crio-6d44daaa5ddfa36f5603f900543161892a108312ada82c3c460cec91da677145 WatchSource:0}: Error finding container 6d44daaa5ddfa36f5603f900543161892a108312ada82c3c460cec91da677145: Status 404 returned error can't find the container with id 6d44daaa5ddfa36f5603f900543161892a108312ada82c3c460cec91da677145 Oct 12 07:45:58 crc kubenswrapper[4599]: I1012 07:45:58.267308 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ds62j" event={"ID":"2167f650-c160-4ede-ae67-5c8fd1f86b25","Type":"ContainerStarted","Data":"6d44daaa5ddfa36f5603f900543161892a108312ada82c3c460cec91da677145"} Oct 12 07:45:59 crc kubenswrapper[4599]: I1012 07:45:59.277814 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ds62j" event={"ID":"2167f650-c160-4ede-ae67-5c8fd1f86b25","Type":"ContainerStarted","Data":"6acbbdeb9832aede105d4e02f28fcb80316521563eb9f6f384204282be1332ef"} Oct 12 07:45:59 crc kubenswrapper[4599]: I1012 07:45:59.289916 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-ds62j" podStartSLOduration=2.433027646 podStartE2EDuration="3.289896496s" podCreationTimestamp="2025-10-12 07:45:56 +0000 UTC" 
firstStartedPulling="2025-10-12 07:45:57.323154102 +0000 UTC m=+654.112349604" lastFinishedPulling="2025-10-12 07:45:58.180022952 +0000 UTC m=+654.969218454" observedRunningTime="2025-10-12 07:45:59.28956473 +0000 UTC m=+656.078760232" watchObservedRunningTime="2025-10-12 07:45:59.289896496 +0000 UTC m=+656.079091998" Oct 12 07:45:59 crc kubenswrapper[4599]: E1012 07:45:59.862993 4599 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6dd7df5_77c0_4ffb_9d14_425465bb9ab3.slice/crio-4c9ebcce603996059a2cd9cf9e3d47b7705d9410fb632f75638f89b270c0798c\": RecentStats: unable to find data in memory cache]" Oct 12 07:46:02 crc kubenswrapper[4599]: I1012 07:46:02.901127 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-76vb2" Oct 12 07:46:02 crc kubenswrapper[4599]: I1012 07:46:02.910000 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcx46" Oct 12 07:46:06 crc kubenswrapper[4599]: I1012 07:46:06.956283 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-ds62j" Oct 12 07:46:06 crc kubenswrapper[4599]: I1012 07:46:06.956390 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-ds62j" Oct 12 07:46:06 crc kubenswrapper[4599]: I1012 07:46:06.982991 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-ds62j" Oct 12 07:46:07 crc kubenswrapper[4599]: I1012 07:46:07.339124 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-ds62j" Oct 12 07:46:14 crc kubenswrapper[4599]: I1012 07:46:14.407088 4599 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj"] Oct 12 07:46:14 crc kubenswrapper[4599]: I1012 07:46:14.408416 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj" Oct 12 07:46:14 crc kubenswrapper[4599]: I1012 07:46:14.410123 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vd5tj" Oct 12 07:46:14 crc kubenswrapper[4599]: I1012 07:46:14.415569 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj"] Oct 12 07:46:14 crc kubenswrapper[4599]: I1012 07:46:14.503514 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbjnf\" (UniqueName: \"kubernetes.io/projected/a64bef86-ebe2-411a-92d3-f63087030b92-kube-api-access-rbjnf\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj\" (UID: \"a64bef86-ebe2-411a-92d3-f63087030b92\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj" Oct 12 07:46:14 crc kubenswrapper[4599]: I1012 07:46:14.503701 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a64bef86-ebe2-411a-92d3-f63087030b92-bundle\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj\" (UID: \"a64bef86-ebe2-411a-92d3-f63087030b92\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj" Oct 12 07:46:14 crc kubenswrapper[4599]: I1012 07:46:14.503793 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a64bef86-ebe2-411a-92d3-f63087030b92-util\") pod 
\"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj\" (UID: \"a64bef86-ebe2-411a-92d3-f63087030b92\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj" Oct 12 07:46:14 crc kubenswrapper[4599]: I1012 07:46:14.604705 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a64bef86-ebe2-411a-92d3-f63087030b92-bundle\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj\" (UID: \"a64bef86-ebe2-411a-92d3-f63087030b92\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj" Oct 12 07:46:14 crc kubenswrapper[4599]: I1012 07:46:14.605210 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a64bef86-ebe2-411a-92d3-f63087030b92-util\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj\" (UID: \"a64bef86-ebe2-411a-92d3-f63087030b92\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj" Oct 12 07:46:14 crc kubenswrapper[4599]: I1012 07:46:14.605261 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbjnf\" (UniqueName: \"kubernetes.io/projected/a64bef86-ebe2-411a-92d3-f63087030b92-kube-api-access-rbjnf\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj\" (UID: \"a64bef86-ebe2-411a-92d3-f63087030b92\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj" Oct 12 07:46:14 crc kubenswrapper[4599]: I1012 07:46:14.605414 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a64bef86-ebe2-411a-92d3-f63087030b92-bundle\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj\" (UID: \"a64bef86-ebe2-411a-92d3-f63087030b92\") " 
pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj" Oct 12 07:46:14 crc kubenswrapper[4599]: I1012 07:46:14.605562 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a64bef86-ebe2-411a-92d3-f63087030b92-util\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj\" (UID: \"a64bef86-ebe2-411a-92d3-f63087030b92\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj" Oct 12 07:46:14 crc kubenswrapper[4599]: I1012 07:46:14.625855 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbjnf\" (UniqueName: \"kubernetes.io/projected/a64bef86-ebe2-411a-92d3-f63087030b92-kube-api-access-rbjnf\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj\" (UID: \"a64bef86-ebe2-411a-92d3-f63087030b92\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj" Oct 12 07:46:14 crc kubenswrapper[4599]: I1012 07:46:14.721814 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj" Oct 12 07:46:15 crc kubenswrapper[4599]: I1012 07:46:15.088984 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj"] Oct 12 07:46:15 crc kubenswrapper[4599]: I1012 07:46:15.359068 4599 generic.go:334] "Generic (PLEG): container finished" podID="a64bef86-ebe2-411a-92d3-f63087030b92" containerID="f97dd77f62a34691d2acc895cc14e39eed16760ab194cbf67c33fe17144dbc38" exitCode=0 Oct 12 07:46:15 crc kubenswrapper[4599]: I1012 07:46:15.359141 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj" event={"ID":"a64bef86-ebe2-411a-92d3-f63087030b92","Type":"ContainerDied","Data":"f97dd77f62a34691d2acc895cc14e39eed16760ab194cbf67c33fe17144dbc38"} Oct 12 07:46:15 crc kubenswrapper[4599]: I1012 07:46:15.359196 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj" event={"ID":"a64bef86-ebe2-411a-92d3-f63087030b92","Type":"ContainerStarted","Data":"49e457324434150922df35025e11e6a0dd41c0d080426392fc36f4ec5cbf42f7"} Oct 12 07:46:16 crc kubenswrapper[4599]: I1012 07:46:16.366147 4599 generic.go:334] "Generic (PLEG): container finished" podID="a64bef86-ebe2-411a-92d3-f63087030b92" containerID="f5e6b697ad1a35694b06cebddfba28b5ccd9291224d6fefccf44e2d39841a639" exitCode=0 Oct 12 07:46:16 crc kubenswrapper[4599]: I1012 07:46:16.366235 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj" event={"ID":"a64bef86-ebe2-411a-92d3-f63087030b92","Type":"ContainerDied","Data":"f5e6b697ad1a35694b06cebddfba28b5ccd9291224d6fefccf44e2d39841a639"} Oct 12 07:46:17 crc kubenswrapper[4599]: I1012 07:46:17.375240 4599 generic.go:334] 
"Generic (PLEG): container finished" podID="a64bef86-ebe2-411a-92d3-f63087030b92" containerID="0ad10dc0a161092f155a2bf955d706a233afc75036d06a71ff0e4fb1dda52508" exitCode=0 Oct 12 07:46:17 crc kubenswrapper[4599]: I1012 07:46:17.375305 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj" event={"ID":"a64bef86-ebe2-411a-92d3-f63087030b92","Type":"ContainerDied","Data":"0ad10dc0a161092f155a2bf955d706a233afc75036d06a71ff0e4fb1dda52508"} Oct 12 07:46:18 crc kubenswrapper[4599]: I1012 07:46:18.591046 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj" Oct 12 07:46:18 crc kubenswrapper[4599]: I1012 07:46:18.663744 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a64bef86-ebe2-411a-92d3-f63087030b92-bundle\") pod \"a64bef86-ebe2-411a-92d3-f63087030b92\" (UID: \"a64bef86-ebe2-411a-92d3-f63087030b92\") " Oct 12 07:46:18 crc kubenswrapper[4599]: I1012 07:46:18.663794 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a64bef86-ebe2-411a-92d3-f63087030b92-util\") pod \"a64bef86-ebe2-411a-92d3-f63087030b92\" (UID: \"a64bef86-ebe2-411a-92d3-f63087030b92\") " Oct 12 07:46:18 crc kubenswrapper[4599]: I1012 07:46:18.663873 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbjnf\" (UniqueName: \"kubernetes.io/projected/a64bef86-ebe2-411a-92d3-f63087030b92-kube-api-access-rbjnf\") pod \"a64bef86-ebe2-411a-92d3-f63087030b92\" (UID: \"a64bef86-ebe2-411a-92d3-f63087030b92\") " Oct 12 07:46:18 crc kubenswrapper[4599]: I1012 07:46:18.664764 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a64bef86-ebe2-411a-92d3-f63087030b92-bundle" (OuterVolumeSpecName: "bundle") pod "a64bef86-ebe2-411a-92d3-f63087030b92" (UID: "a64bef86-ebe2-411a-92d3-f63087030b92"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:46:18 crc kubenswrapper[4599]: I1012 07:46:18.670806 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a64bef86-ebe2-411a-92d3-f63087030b92-kube-api-access-rbjnf" (OuterVolumeSpecName: "kube-api-access-rbjnf") pod "a64bef86-ebe2-411a-92d3-f63087030b92" (UID: "a64bef86-ebe2-411a-92d3-f63087030b92"). InnerVolumeSpecName "kube-api-access-rbjnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:46:18 crc kubenswrapper[4599]: I1012 07:46:18.676240 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a64bef86-ebe2-411a-92d3-f63087030b92-util" (OuterVolumeSpecName: "util") pod "a64bef86-ebe2-411a-92d3-f63087030b92" (UID: "a64bef86-ebe2-411a-92d3-f63087030b92"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:46:18 crc kubenswrapper[4599]: I1012 07:46:18.765133 4599 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a64bef86-ebe2-411a-92d3-f63087030b92-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:46:18 crc kubenswrapper[4599]: I1012 07:46:18.765434 4599 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a64bef86-ebe2-411a-92d3-f63087030b92-util\") on node \"crc\" DevicePath \"\"" Oct 12 07:46:18 crc kubenswrapper[4599]: I1012 07:46:18.765446 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbjnf\" (UniqueName: \"kubernetes.io/projected/a64bef86-ebe2-411a-92d3-f63087030b92-kube-api-access-rbjnf\") on node \"crc\" DevicePath \"\"" Oct 12 07:46:19 crc kubenswrapper[4599]: I1012 07:46:19.392166 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj" event={"ID":"a64bef86-ebe2-411a-92d3-f63087030b92","Type":"ContainerDied","Data":"49e457324434150922df35025e11e6a0dd41c0d080426392fc36f4ec5cbf42f7"} Oct 12 07:46:19 crc kubenswrapper[4599]: I1012 07:46:19.392209 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj" Oct 12 07:46:19 crc kubenswrapper[4599]: I1012 07:46:19.392212 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49e457324434150922df35025e11e6a0dd41c0d080426392fc36f4ec5cbf42f7" Oct 12 07:46:22 crc kubenswrapper[4599]: I1012 07:46:22.439710 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-688d597459-gfhbq"] Oct 12 07:46:22 crc kubenswrapper[4599]: E1012 07:46:22.440501 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a64bef86-ebe2-411a-92d3-f63087030b92" containerName="extract" Oct 12 07:46:22 crc kubenswrapper[4599]: I1012 07:46:22.440514 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="a64bef86-ebe2-411a-92d3-f63087030b92" containerName="extract" Oct 12 07:46:22 crc kubenswrapper[4599]: E1012 07:46:22.440524 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a64bef86-ebe2-411a-92d3-f63087030b92" containerName="util" Oct 12 07:46:22 crc kubenswrapper[4599]: I1012 07:46:22.440530 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="a64bef86-ebe2-411a-92d3-f63087030b92" containerName="util" Oct 12 07:46:22 crc kubenswrapper[4599]: E1012 07:46:22.440552 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a64bef86-ebe2-411a-92d3-f63087030b92" containerName="pull" Oct 12 07:46:22 crc kubenswrapper[4599]: I1012 07:46:22.440557 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="a64bef86-ebe2-411a-92d3-f63087030b92" containerName="pull" Oct 12 07:46:22 crc kubenswrapper[4599]: I1012 07:46:22.440655 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="a64bef86-ebe2-411a-92d3-f63087030b92" containerName="extract" Oct 12 07:46:22 crc kubenswrapper[4599]: I1012 07:46:22.441262 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-688d597459-gfhbq" Oct 12 07:46:22 crc kubenswrapper[4599]: I1012 07:46:22.443230 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-nqptn" Oct 12 07:46:22 crc kubenswrapper[4599]: I1012 07:46:22.500718 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-688d597459-gfhbq"] Oct 12 07:46:22 crc kubenswrapper[4599]: I1012 07:46:22.515299 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-575b6\" (UniqueName: \"kubernetes.io/projected/bc697113-b995-44e9-92d2-070e55b12965-kube-api-access-575b6\") pod \"openstack-operator-controller-operator-688d597459-gfhbq\" (UID: \"bc697113-b995-44e9-92d2-070e55b12965\") " pod="openstack-operators/openstack-operator-controller-operator-688d597459-gfhbq" Oct 12 07:46:22 crc kubenswrapper[4599]: I1012 07:46:22.616023 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-575b6\" (UniqueName: \"kubernetes.io/projected/bc697113-b995-44e9-92d2-070e55b12965-kube-api-access-575b6\") pod \"openstack-operator-controller-operator-688d597459-gfhbq\" (UID: \"bc697113-b995-44e9-92d2-070e55b12965\") " pod="openstack-operators/openstack-operator-controller-operator-688d597459-gfhbq" Oct 12 07:46:22 crc kubenswrapper[4599]: I1012 07:46:22.650172 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-575b6\" (UniqueName: \"kubernetes.io/projected/bc697113-b995-44e9-92d2-070e55b12965-kube-api-access-575b6\") pod \"openstack-operator-controller-operator-688d597459-gfhbq\" (UID: \"bc697113-b995-44e9-92d2-070e55b12965\") " pod="openstack-operators/openstack-operator-controller-operator-688d597459-gfhbq" Oct 12 07:46:22 crc kubenswrapper[4599]: I1012 07:46:22.757764 4599 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-688d597459-gfhbq" Oct 12 07:46:22 crc kubenswrapper[4599]: I1012 07:46:22.943446 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-688d597459-gfhbq"] Oct 12 07:46:23 crc kubenswrapper[4599]: I1012 07:46:23.415565 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-688d597459-gfhbq" event={"ID":"bc697113-b995-44e9-92d2-070e55b12965","Type":"ContainerStarted","Data":"0ba6269b6066707411a33b47bab2d92a1544399bded3fc77a708a0d0546513c7"} Oct 12 07:46:27 crc kubenswrapper[4599]: I1012 07:46:27.437486 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-688d597459-gfhbq" event={"ID":"bc697113-b995-44e9-92d2-070e55b12965","Type":"ContainerStarted","Data":"bb6bf644fa610de6044e2af7ca5d799195c6dda919a407d62072cf644e4a488c"} Oct 12 07:46:28 crc kubenswrapper[4599]: I1012 07:46:28.322248 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:46:28 crc kubenswrapper[4599]: I1012 07:46:28.322611 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:46:29 crc kubenswrapper[4599]: I1012 07:46:29.451402 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-operator-688d597459-gfhbq" event={"ID":"bc697113-b995-44e9-92d2-070e55b12965","Type":"ContainerStarted","Data":"141e6214709dd4568b95764bb32c03e3146bd8685605287b2a5953d5b7929641"} Oct 12 07:46:29 crc kubenswrapper[4599]: I1012 07:46:29.451602 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-688d597459-gfhbq" Oct 12 07:46:29 crc kubenswrapper[4599]: I1012 07:46:29.480527 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-688d597459-gfhbq" podStartSLOduration=2.157011051 podStartE2EDuration="7.480500308s" podCreationTimestamp="2025-10-12 07:46:22 +0000 UTC" firstStartedPulling="2025-10-12 07:46:22.952611363 +0000 UTC m=+679.741806865" lastFinishedPulling="2025-10-12 07:46:28.27610062 +0000 UTC m=+685.065296122" observedRunningTime="2025-10-12 07:46:29.475557249 +0000 UTC m=+686.264752751" watchObservedRunningTime="2025-10-12 07:46:29.480500308 +0000 UTC m=+686.269695810" Oct 12 07:46:32 crc kubenswrapper[4599]: I1012 07:46:32.760903 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-688d597459-gfhbq" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.175305 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-65btt"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.176741 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-65btt" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.180901 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-tzrg7" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.182952 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-nwqxv"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.184030 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-nwqxv" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.185249 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-9tsq5" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.193574 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-65btt"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.202024 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-rd8wv"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.203008 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rd8wv" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.204855 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-d7tql" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.207194 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-kbqh6"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.208096 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-kbqh6" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.209417 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-kx9r6" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.218121 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-nwqxv"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.260092 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-rd8wv"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.284769 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-kbqh6"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.299887 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-n6scs"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.301016 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-n6scs" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.311443 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-nn5kh" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.321811 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-n6scs"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.326534 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-hlvdz"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.327599 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-hlvdz" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.329866 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-mfr72" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.331916 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-6xt98"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.333054 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-6xt98" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.340700 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.340992 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-hwn8j" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.357674 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-hlvdz"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.366382 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-6xt98"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.367573 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-7zfpt"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.368680 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-7zfpt" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.373493 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-hjk9b" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.378064 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-gq5k6"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.379178 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-7zfpt"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.379277 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-gq5k6" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.380903 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bldf\" (UniqueName: \"kubernetes.io/projected/4f32c957-e414-456c-b06e-6f38553efe85-kube-api-access-2bldf\") pod \"glance-operator-controller-manager-84b9b84486-kbqh6\" (UID: \"4f32c957-e414-456c-b06e-6f38553efe85\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-kbqh6" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.380956 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxvrg\" (UniqueName: \"kubernetes.io/projected/32a9cc54-3488-4659-a83f-0a6dc0c402c9-kube-api-access-jxvrg\") pod \"infra-operator-controller-manager-656bcbd775-6xt98\" (UID: \"32a9cc54-3488-4659-a83f-0a6dc0c402c9\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-6xt98" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.380994 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2xst\" (UniqueName: \"kubernetes.io/projected/f13311d0-566a-4c8d-823c-fae47384cd53-kube-api-access-p2xst\") pod \"cinder-operator-controller-manager-7b7fb68549-nwqxv\" (UID: \"f13311d0-566a-4c8d-823c-fae47384cd53\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-nwqxv" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.381027 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w74pp\" (UniqueName: \"kubernetes.io/projected/390a00af-1983-41ce-b7f2-3190e2d1594e-kube-api-access-w74pp\") pod \"barbican-operator-controller-manager-658bdf4b74-65btt\" (UID: \"390a00af-1983-41ce-b7f2-3190e2d1594e\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-65btt" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.381061 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td8s5\" (UniqueName: \"kubernetes.io/projected/297195f2-cb07-4bd8-9994-82f16e2f83f3-kube-api-access-td8s5\") pod \"keystone-operator-controller-manager-55b6b7c7b8-7zfpt\" (UID: \"297195f2-cb07-4bd8-9994-82f16e2f83f3\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-7zfpt" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.381109 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32a9cc54-3488-4659-a83f-0a6dc0c402c9-cert\") pod \"infra-operator-controller-manager-656bcbd775-6xt98\" (UID: \"32a9cc54-3488-4659-a83f-0a6dc0c402c9\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-6xt98" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.381151 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fknw\" (UniqueName: 
\"kubernetes.io/projected/8ce364df-e28b-45cf-ae95-92ae415392f0-kube-api-access-4fknw\") pod \"horizon-operator-controller-manager-7ffbcb7588-hlvdz\" (UID: \"8ce364df-e28b-45cf-ae95-92ae415392f0\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-hlvdz" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.381172 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvg57\" (UniqueName: \"kubernetes.io/projected/37355d3f-b321-446a-b0ac-5d3a770bd0c5-kube-api-access-tvg57\") pod \"designate-operator-controller-manager-85d5d9dd78-rd8wv\" (UID: \"37355d3f-b321-446a-b0ac-5d3a770bd0c5\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rd8wv" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.381194 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mpzf\" (UniqueName: \"kubernetes.io/projected/f70b2a0c-df5a-4a41-89db-e1bf314ee45a-kube-api-access-5mpzf\") pod \"ironic-operator-controller-manager-9c5c78d49-gq5k6\" (UID: \"f70b2a0c-df5a-4a41-89db-e1bf314ee45a\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-gq5k6" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.381214 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2ksb\" (UniqueName: \"kubernetes.io/projected/986744d6-7f19-4f08-9dfd-03629fe2ca58-kube-api-access-d2ksb\") pod \"heat-operator-controller-manager-858f76bbdd-n6scs\" (UID: \"986744d6-7f19-4f08-9dfd-03629fe2ca58\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-n6scs" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.382543 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-nvp86" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.385472 4599 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-gq5k6"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.400614 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-sbgr8"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.401817 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-sbgr8" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.411699 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-c62kr" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.413464 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-sbgr8"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.418382 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-5zhzq"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.419443 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-5zhzq" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.423402 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-d6dzw" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.429707 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-cz2h5"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.434396 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-cz2h5" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.435570 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-w9zsp" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.438430 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-5zhzq"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.455518 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-cz2h5"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.461419 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-md9kr"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.462753 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5df598886f-md9kr" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.464204 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-qxsp4" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.466629 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-8ccz2"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.467764 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-8ccz2" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.478870 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-7lbwg" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.483956 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-md9kr"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.486291 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvg57\" (UniqueName: \"kubernetes.io/projected/37355d3f-b321-446a-b0ac-5d3a770bd0c5-kube-api-access-tvg57\") pod \"designate-operator-controller-manager-85d5d9dd78-rd8wv\" (UID: \"37355d3f-b321-446a-b0ac-5d3a770bd0c5\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rd8wv" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.486345 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fknw\" (UniqueName: \"kubernetes.io/projected/8ce364df-e28b-45cf-ae95-92ae415392f0-kube-api-access-4fknw\") pod \"horizon-operator-controller-manager-7ffbcb7588-hlvdz\" (UID: \"8ce364df-e28b-45cf-ae95-92ae415392f0\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-hlvdz" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.486382 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mpzf\" (UniqueName: \"kubernetes.io/projected/f70b2a0c-df5a-4a41-89db-e1bf314ee45a-kube-api-access-5mpzf\") pod \"ironic-operator-controller-manager-9c5c78d49-gq5k6\" (UID: \"f70b2a0c-df5a-4a41-89db-e1bf314ee45a\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-gq5k6" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.486413 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-d2ksb\" (UniqueName: \"kubernetes.io/projected/986744d6-7f19-4f08-9dfd-03629fe2ca58-kube-api-access-d2ksb\") pod \"heat-operator-controller-manager-858f76bbdd-n6scs\" (UID: \"986744d6-7f19-4f08-9dfd-03629fe2ca58\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-n6scs" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.486473 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bldf\" (UniqueName: \"kubernetes.io/projected/4f32c957-e414-456c-b06e-6f38553efe85-kube-api-access-2bldf\") pod \"glance-operator-controller-manager-84b9b84486-kbqh6\" (UID: \"4f32c957-e414-456c-b06e-6f38553efe85\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-kbqh6" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.486500 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxvrg\" (UniqueName: \"kubernetes.io/projected/32a9cc54-3488-4659-a83f-0a6dc0c402c9-kube-api-access-jxvrg\") pod \"infra-operator-controller-manager-656bcbd775-6xt98\" (UID: \"32a9cc54-3488-4659-a83f-0a6dc0c402c9\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-6xt98" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.486535 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2xst\" (UniqueName: \"kubernetes.io/projected/f13311d0-566a-4c8d-823c-fae47384cd53-kube-api-access-p2xst\") pod \"cinder-operator-controller-manager-7b7fb68549-nwqxv\" (UID: \"f13311d0-566a-4c8d-823c-fae47384cd53\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-nwqxv" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.486575 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w74pp\" (UniqueName: 
\"kubernetes.io/projected/390a00af-1983-41ce-b7f2-3190e2d1594e-kube-api-access-w74pp\") pod \"barbican-operator-controller-manager-658bdf4b74-65btt\" (UID: \"390a00af-1983-41ce-b7f2-3190e2d1594e\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-65btt" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.486606 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td8s5\" (UniqueName: \"kubernetes.io/projected/297195f2-cb07-4bd8-9994-82f16e2f83f3-kube-api-access-td8s5\") pod \"keystone-operator-controller-manager-55b6b7c7b8-7zfpt\" (UID: \"297195f2-cb07-4bd8-9994-82f16e2f83f3\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-7zfpt" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.486671 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32a9cc54-3488-4659-a83f-0a6dc0c402c9-cert\") pod \"infra-operator-controller-manager-656bcbd775-6xt98\" (UID: \"32a9cc54-3488-4659-a83f-0a6dc0c402c9\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-6xt98" Oct 12 07:46:48 crc kubenswrapper[4599]: E1012 07:46:48.486823 4599 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 12 07:46:48 crc kubenswrapper[4599]: E1012 07:46:48.486880 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32a9cc54-3488-4659-a83f-0a6dc0c402c9-cert podName:32a9cc54-3488-4659-a83f-0a6dc0c402c9 nodeName:}" failed. No retries permitted until 2025-10-12 07:46:48.986859966 +0000 UTC m=+705.776055468 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/32a9cc54-3488-4659-a83f-0a6dc0c402c9-cert") pod "infra-operator-controller-manager-656bcbd775-6xt98" (UID: "32a9cc54-3488-4659-a83f-0a6dc0c402c9") : secret "infra-operator-webhook-server-cert" not found Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.497436 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.507608 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-8ccz2"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.507745 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.514043 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.514239 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79df5fb58c-2fhbx"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.514707 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-lghtg" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.515294 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2fhbx" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.527266 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mpzf\" (UniqueName: \"kubernetes.io/projected/f70b2a0c-df5a-4a41-89db-e1bf314ee45a-kube-api-access-5mpzf\") pod \"ironic-operator-controller-manager-9c5c78d49-gq5k6\" (UID: \"f70b2a0c-df5a-4a41-89db-e1bf314ee45a\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-gq5k6" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.527285 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fknw\" (UniqueName: \"kubernetes.io/projected/8ce364df-e28b-45cf-ae95-92ae415392f0-kube-api-access-4fknw\") pod \"horizon-operator-controller-manager-7ffbcb7588-hlvdz\" (UID: \"8ce364df-e28b-45cf-ae95-92ae415392f0\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-hlvdz" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.527760 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td8s5\" (UniqueName: \"kubernetes.io/projected/297195f2-cb07-4bd8-9994-82f16e2f83f3-kube-api-access-td8s5\") pod \"keystone-operator-controller-manager-55b6b7c7b8-7zfpt\" (UID: \"297195f2-cb07-4bd8-9994-82f16e2f83f3\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-7zfpt" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.528260 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvg57\" (UniqueName: \"kubernetes.io/projected/37355d3f-b321-446a-b0ac-5d3a770bd0c5-kube-api-access-tvg57\") pod \"designate-operator-controller-manager-85d5d9dd78-rd8wv\" (UID: \"37355d3f-b321-446a-b0ac-5d3a770bd0c5\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rd8wv" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.528362 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2xst\" (UniqueName: \"kubernetes.io/projected/f13311d0-566a-4c8d-823c-fae47384cd53-kube-api-access-p2xst\") pod \"cinder-operator-controller-manager-7b7fb68549-nwqxv\" (UID: \"f13311d0-566a-4c8d-823c-fae47384cd53\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-nwqxv" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.532289 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w74pp\" (UniqueName: \"kubernetes.io/projected/390a00af-1983-41ce-b7f2-3190e2d1594e-kube-api-access-w74pp\") pod \"barbican-operator-controller-manager-658bdf4b74-65btt\" (UID: \"390a00af-1983-41ce-b7f2-3190e2d1594e\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-65btt" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.533648 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-d2q7d"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.533773 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-wxpbw" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.534523 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxvrg\" (UniqueName: \"kubernetes.io/projected/32a9cc54-3488-4659-a83f-0a6dc0c402c9-kube-api-access-jxvrg\") pod \"infra-operator-controller-manager-656bcbd775-6xt98\" (UID: \"32a9cc54-3488-4659-a83f-0a6dc0c402c9\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-6xt98" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.533659 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rd8wv" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.550783 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79df5fb58c-2fhbx"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.551277 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-d2q7d"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.551694 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-d2q7d" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.554170 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-ht7lp" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.556657 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2ksb\" (UniqueName: \"kubernetes.io/projected/986744d6-7f19-4f08-9dfd-03629fe2ca58-kube-api-access-d2ksb\") pod \"heat-operator-controller-manager-858f76bbdd-n6scs\" (UID: \"986744d6-7f19-4f08-9dfd-03629fe2ca58\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-n6scs" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.559861 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.566958 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bldf\" (UniqueName: \"kubernetes.io/projected/4f32c957-e414-456c-b06e-6f38553efe85-kube-api-access-2bldf\") pod \"glance-operator-controller-manager-84b9b84486-kbqh6\" (UID: \"4f32c957-e414-456c-b06e-6f38553efe85\") " 
pod="openstack-operators/glance-operator-controller-manager-84b9b84486-kbqh6" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.579063 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-gzlp6"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.583119 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-gzlp6" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.588128 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnzfb\" (UniqueName: \"kubernetes.io/projected/e88fb634-df40-47e9-a349-e7ac89e134f2-kube-api-access-nnzfb\") pod \"neutron-operator-controller-manager-79d585cb66-cz2h5\" (UID: \"e88fb634-df40-47e9-a349-e7ac89e134f2\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-cz2h5" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.588247 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4bvq\" (UniqueName: \"kubernetes.io/projected/6d340683-0013-4bf4-b98b-32610996ded4-kube-api-access-q4bvq\") pod \"manila-operator-controller-manager-5f67fbc655-sbgr8\" (UID: \"6d340683-0013-4bf4-b98b-32610996ded4\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-sbgr8" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.588270 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svc85\" (UniqueName: \"kubernetes.io/projected/bdeb0fae-d9af-4253-a90c-8a50255cc6fe-kube-api-access-svc85\") pod \"octavia-operator-controller-manager-69fdcfc5f5-8ccz2\" (UID: \"bdeb0fae-d9af-4253-a90c-8a50255cc6fe\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-8ccz2" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.588307 4599 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqqh6\" (UniqueName: \"kubernetes.io/projected/07c5394f-62de-4eca-86c0-c534788aead5-kube-api-access-pqqh6\") pod \"mariadb-operator-controller-manager-f9fb45f8f-5zhzq\" (UID: \"07c5394f-62de-4eca-86c0-c534788aead5\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-5zhzq" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.588326 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s98nr\" (UniqueName: \"kubernetes.io/projected/84280387-ac26-4496-8c00-72673a91cb12-kube-api-access-s98nr\") pod \"nova-operator-controller-manager-5df598886f-md9kr\" (UID: \"84280387-ac26-4496-8c00-72673a91cb12\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-md9kr" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.589375 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-5f6wx" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.591841 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-gzlp6"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.609471 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-67cfc6749b-m9l8f"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.610769 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-m9l8f" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.612269 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-xxgrq" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.614952 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-67cfc6749b-m9l8f"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.625021 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-n6scs" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.651315 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-hlvdz" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.688225 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5458f77c4-b6bwb"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.689815 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5458f77c4-b6bwb" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.689855 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-7zfpt" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.691822 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws7mb\" (UniqueName: \"kubernetes.io/projected/caf862ab-9fa9-4c44-8e6c-35599bcc45a1-kube-api-access-ws7mb\") pod \"swift-operator-controller-manager-db6d7f97b-gzlp6\" (UID: \"caf862ab-9fa9-4c44-8e6c-35599bcc45a1\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-gzlp6" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.691849 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cdab794-6175-4fb9-bd9d-c1080d45ee30-cert\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj\" (UID: \"0cdab794-6175-4fb9-bd9d-c1080d45ee30\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.692326 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4bvq\" (UniqueName: \"kubernetes.io/projected/6d340683-0013-4bf4-b98b-32610996ded4-kube-api-access-q4bvq\") pod \"manila-operator-controller-manager-5f67fbc655-sbgr8\" (UID: \"6d340683-0013-4bf4-b98b-32610996ded4\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-sbgr8" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.692943 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svc85\" (UniqueName: \"kubernetes.io/projected/bdeb0fae-d9af-4253-a90c-8a50255cc6fe-kube-api-access-svc85\") pod \"octavia-operator-controller-manager-69fdcfc5f5-8ccz2\" (UID: \"bdeb0fae-d9af-4253-a90c-8a50255cc6fe\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-8ccz2" Oct 12 07:46:48 crc 
kubenswrapper[4599]: I1012 07:46:48.693021 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz2ts\" (UniqueName: \"kubernetes.io/projected/0cdab794-6175-4fb9-bd9d-c1080d45ee30-kube-api-access-mz2ts\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj\" (UID: \"0cdab794-6175-4fb9-bd9d-c1080d45ee30\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.693058 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqqh6\" (UniqueName: \"kubernetes.io/projected/07c5394f-62de-4eca-86c0-c534788aead5-kube-api-access-pqqh6\") pod \"mariadb-operator-controller-manager-f9fb45f8f-5zhzq\" (UID: \"07c5394f-62de-4eca-86c0-c534788aead5\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-5zhzq" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.693083 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s98nr\" (UniqueName: \"kubernetes.io/projected/84280387-ac26-4496-8c00-72673a91cb12-kube-api-access-s98nr\") pod \"nova-operator-controller-manager-5df598886f-md9kr\" (UID: \"84280387-ac26-4496-8c00-72673a91cb12\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-md9kr" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.694006 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx966\" (UniqueName: \"kubernetes.io/projected/dc9b0db0-82c1-492b-94de-c8f93e96364f-kube-api-access-dx966\") pod \"ovn-operator-controller-manager-79df5fb58c-2fhbx\" (UID: \"dc9b0db0-82c1-492b-94de-c8f93e96364f\") " pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2fhbx" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.694096 4599 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-nnzfb\" (UniqueName: \"kubernetes.io/projected/e88fb634-df40-47e9-a349-e7ac89e134f2-kube-api-access-nnzfb\") pod \"neutron-operator-controller-manager-79d585cb66-cz2h5\" (UID: \"e88fb634-df40-47e9-a349-e7ac89e134f2\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-cz2h5" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.694121 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdlxz\" (UniqueName: \"kubernetes.io/projected/b1de56e0-d6e0-4c5a-9c4e-c725f171e142-kube-api-access-hdlxz\") pod \"placement-operator-controller-manager-68b6c87b68-d2q7d\" (UID: \"b1de56e0-d6e0-4c5a-9c4e-c725f171e142\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-d2q7d" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.694202 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-hmf6z" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.703083 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-gq5k6" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.709977 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqqh6\" (UniqueName: \"kubernetes.io/projected/07c5394f-62de-4eca-86c0-c534788aead5-kube-api-access-pqqh6\") pod \"mariadb-operator-controller-manager-f9fb45f8f-5zhzq\" (UID: \"07c5394f-62de-4eca-86c0-c534788aead5\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-5zhzq" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.710048 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svc85\" (UniqueName: \"kubernetes.io/projected/bdeb0fae-d9af-4253-a90c-8a50255cc6fe-kube-api-access-svc85\") pod \"octavia-operator-controller-manager-69fdcfc5f5-8ccz2\" (UID: \"bdeb0fae-d9af-4253-a90c-8a50255cc6fe\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-8ccz2" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.710223 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnzfb\" (UniqueName: \"kubernetes.io/projected/e88fb634-df40-47e9-a349-e7ac89e134f2-kube-api-access-nnzfb\") pod \"neutron-operator-controller-manager-79d585cb66-cz2h5\" (UID: \"e88fb634-df40-47e9-a349-e7ac89e134f2\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-cz2h5" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.710863 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4bvq\" (UniqueName: \"kubernetes.io/projected/6d340683-0013-4bf4-b98b-32610996ded4-kube-api-access-q4bvq\") pod \"manila-operator-controller-manager-5f67fbc655-sbgr8\" (UID: \"6d340683-0013-4bf4-b98b-32610996ded4\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-sbgr8" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.714259 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s98nr\" (UniqueName: \"kubernetes.io/projected/84280387-ac26-4496-8c00-72673a91cb12-kube-api-access-s98nr\") pod \"nova-operator-controller-manager-5df598886f-md9kr\" (UID: \"84280387-ac26-4496-8c00-72673a91cb12\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-md9kr" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.715507 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5458f77c4-b6bwb"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.727516 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-sbgr8" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.742990 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-5zhzq" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.753694 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-cz2h5" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.782448 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f554bff7b-ffpvz"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.784030 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-ffpvz" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.789532 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-gfxst" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.793785 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-65btt" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.796034 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdlxz\" (UniqueName: \"kubernetes.io/projected/b1de56e0-d6e0-4c5a-9c4e-c725f171e142-kube-api-access-hdlxz\") pod \"placement-operator-controller-manager-68b6c87b68-d2q7d\" (UID: \"b1de56e0-d6e0-4c5a-9c4e-c725f171e142\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-d2q7d" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.796071 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v7dd\" (UniqueName: \"kubernetes.io/projected/5f9bd209-b904-4998-843c-d4573b0a2cd0-kube-api-access-4v7dd\") pod \"test-operator-controller-manager-5458f77c4-b6bwb\" (UID: \"5f9bd209-b904-4998-843c-d4573b0a2cd0\") " pod="openstack-operators/test-operator-controller-manager-5458f77c4-b6bwb" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.796095 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws7mb\" (UniqueName: \"kubernetes.io/projected/caf862ab-9fa9-4c44-8e6c-35599bcc45a1-kube-api-access-ws7mb\") pod \"swift-operator-controller-manager-db6d7f97b-gzlp6\" (UID: \"caf862ab-9fa9-4c44-8e6c-35599bcc45a1\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-gzlp6" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.796113 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cdab794-6175-4fb9-bd9d-c1080d45ee30-cert\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj\" (UID: \"0cdab794-6175-4fb9-bd9d-c1080d45ee30\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 
07:46:48.796173 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz2ts\" (UniqueName: \"kubernetes.io/projected/0cdab794-6175-4fb9-bd9d-c1080d45ee30-kube-api-access-mz2ts\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj\" (UID: \"0cdab794-6175-4fb9-bd9d-c1080d45ee30\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.796219 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxdgb\" (UniqueName: \"kubernetes.io/projected/5c565798-c0f8-4d14-b531-386b1b0efc63-kube-api-access-cxdgb\") pod \"telemetry-operator-controller-manager-67cfc6749b-m9l8f\" (UID: \"5c565798-c0f8-4d14-b531-386b1b0efc63\") " pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-m9l8f" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.796246 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx966\" (UniqueName: \"kubernetes.io/projected/dc9b0db0-82c1-492b-94de-c8f93e96364f-kube-api-access-dx966\") pod \"ovn-operator-controller-manager-79df5fb58c-2fhbx\" (UID: \"dc9b0db0-82c1-492b-94de-c8f93e96364f\") " pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2fhbx" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.796263 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwkjh\" (UniqueName: \"kubernetes.io/projected/20be3b1c-5de2-4c22-a3af-215e2272d586-kube-api-access-hwkjh\") pod \"watcher-operator-controller-manager-7f554bff7b-ffpvz\" (UID: \"20be3b1c-5de2-4c22-a3af-215e2272d586\") " pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-ffpvz" Oct 12 07:46:48 crc kubenswrapper[4599]: E1012 07:46:48.796385 4599 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 12 07:46:48 crc kubenswrapper[4599]: E1012 07:46:48.796424 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cdab794-6175-4fb9-bd9d-c1080d45ee30-cert podName:0cdab794-6175-4fb9-bd9d-c1080d45ee30 nodeName:}" failed. No retries permitted until 2025-10-12 07:46:49.296409898 +0000 UTC m=+706.085605400 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cdab794-6175-4fb9-bd9d-c1080d45ee30-cert") pod "openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj" (UID: "0cdab794-6175-4fb9-bd9d-c1080d45ee30") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.798495 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f554bff7b-ffpvz"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.815812 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5df598886f-md9kr" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.816608 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-nwqxv" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.823095 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdlxz\" (UniqueName: \"kubernetes.io/projected/b1de56e0-d6e0-4c5a-9c4e-c725f171e142-kube-api-access-hdlxz\") pod \"placement-operator-controller-manager-68b6c87b68-d2q7d\" (UID: \"b1de56e0-d6e0-4c5a-9c4e-c725f171e142\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-d2q7d" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.824491 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz2ts\" (UniqueName: \"kubernetes.io/projected/0cdab794-6175-4fb9-bd9d-c1080d45ee30-kube-api-access-mz2ts\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj\" (UID: \"0cdab794-6175-4fb9-bd9d-c1080d45ee30\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.827182 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx966\" (UniqueName: \"kubernetes.io/projected/dc9b0db0-82c1-492b-94de-c8f93e96364f-kube-api-access-dx966\") pod \"ovn-operator-controller-manager-79df5fb58c-2fhbx\" (UID: \"dc9b0db0-82c1-492b-94de-c8f93e96364f\") " pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2fhbx" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.833962 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws7mb\" (UniqueName: \"kubernetes.io/projected/caf862ab-9fa9-4c44-8e6c-35599bcc45a1-kube-api-access-ws7mb\") pod \"swift-operator-controller-manager-db6d7f97b-gzlp6\" (UID: \"caf862ab-9fa9-4c44-8e6c-35599bcc45a1\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-gzlp6" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.839762 4599 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-8ccz2" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.852678 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-kbqh6" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.898542 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxdgb\" (UniqueName: \"kubernetes.io/projected/5c565798-c0f8-4d14-b531-386b1b0efc63-kube-api-access-cxdgb\") pod \"telemetry-operator-controller-manager-67cfc6749b-m9l8f\" (UID: \"5c565798-c0f8-4d14-b531-386b1b0efc63\") " pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-m9l8f" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.898616 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwkjh\" (UniqueName: \"kubernetes.io/projected/20be3b1c-5de2-4c22-a3af-215e2272d586-kube-api-access-hwkjh\") pod \"watcher-operator-controller-manager-7f554bff7b-ffpvz\" (UID: \"20be3b1c-5de2-4c22-a3af-215e2272d586\") " pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-ffpvz" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.898685 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v7dd\" (UniqueName: \"kubernetes.io/projected/5f9bd209-b904-4998-843c-d4573b0a2cd0-kube-api-access-4v7dd\") pod \"test-operator-controller-manager-5458f77c4-b6bwb\" (UID: \"5f9bd209-b904-4998-843c-d4573b0a2cd0\") " pod="openstack-operators/test-operator-controller-manager-5458f77c4-b6bwb" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.924662 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2fhbx" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.925515 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxdgb\" (UniqueName: \"kubernetes.io/projected/5c565798-c0f8-4d14-b531-386b1b0efc63-kube-api-access-cxdgb\") pod \"telemetry-operator-controller-manager-67cfc6749b-m9l8f\" (UID: \"5c565798-c0f8-4d14-b531-386b1b0efc63\") " pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-m9l8f" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.925923 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwkjh\" (UniqueName: \"kubernetes.io/projected/20be3b1c-5de2-4c22-a3af-215e2272d586-kube-api-access-hwkjh\") pod \"watcher-operator-controller-manager-7f554bff7b-ffpvz\" (UID: \"20be3b1c-5de2-4c22-a3af-215e2272d586\") " pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-ffpvz" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.926508 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b95c8954b-z5lsg"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.928610 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-d2q7d" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.929384 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v7dd\" (UniqueName: \"kubernetes.io/projected/5f9bd209-b904-4998-843c-d4573b0a2cd0-kube-api-access-4v7dd\") pod \"test-operator-controller-manager-5458f77c4-b6bwb\" (UID: \"5f9bd209-b904-4998-843c-d4573b0a2cd0\") " pod="openstack-operators/test-operator-controller-manager-5458f77c4-b6bwb" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.935178 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-z5lsg" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.938199 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.938492 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-gbgch" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.939930 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-gzlp6" Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.946807 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b95c8954b-z5lsg"] Oct 12 07:46:48 crc kubenswrapper[4599]: I1012 07:46:48.981812 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-m9l8f" Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:48.999950 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32a9cc54-3488-4659-a83f-0a6dc0c402c9-cert\") pod \"infra-operator-controller-manager-656bcbd775-6xt98\" (UID: \"32a9cc54-3488-4659-a83f-0a6dc0c402c9\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-6xt98" Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.005532 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32a9cc54-3488-4659-a83f-0a6dc0c402c9-cert\") pod \"infra-operator-controller-manager-656bcbd775-6xt98\" (UID: \"32a9cc54-3488-4659-a83f-0a6dc0c402c9\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-6xt98" Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.016234 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5458f77c4-b6bwb" Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.020473 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7sh6m"] Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.021732 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7sh6m" Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.034701 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-nk9w8" Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.039728 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7sh6m"] Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.094748 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-rd8wv"] Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.115781 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4099eb0-0999-42a2-b525-5ae6b0ad984b-cert\") pod \"openstack-operator-controller-manager-5b95c8954b-z5lsg\" (UID: \"f4099eb0-0999-42a2-b525-5ae6b0ad984b\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-z5lsg" Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.115880 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrmss\" (UniqueName: \"kubernetes.io/projected/f4099eb0-0999-42a2-b525-5ae6b0ad984b-kube-api-access-wrmss\") pod \"openstack-operator-controller-manager-5b95c8954b-z5lsg\" (UID: \"f4099eb0-0999-42a2-b525-5ae6b0ad984b\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-z5lsg" Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.126769 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-ffpvz" Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.216955 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4099eb0-0999-42a2-b525-5ae6b0ad984b-cert\") pod \"openstack-operator-controller-manager-5b95c8954b-z5lsg\" (UID: \"f4099eb0-0999-42a2-b525-5ae6b0ad984b\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-z5lsg" Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.217090 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrmss\" (UniqueName: \"kubernetes.io/projected/f4099eb0-0999-42a2-b525-5ae6b0ad984b-kube-api-access-wrmss\") pod \"openstack-operator-controller-manager-5b95c8954b-z5lsg\" (UID: \"f4099eb0-0999-42a2-b525-5ae6b0ad984b\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-z5lsg" Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.217149 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tq6z\" (UniqueName: \"kubernetes.io/projected/d5252e14-f285-43af-ace5-375bcfbe4c68-kube-api-access-2tq6z\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-7sh6m\" (UID: \"d5252e14-f285-43af-ace5-375bcfbe4c68\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7sh6m" Oct 12 07:46:49 crc kubenswrapper[4599]: E1012 07:46:49.217429 4599 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 12 07:46:49 crc kubenswrapper[4599]: E1012 07:46:49.217477 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4099eb0-0999-42a2-b525-5ae6b0ad984b-cert podName:f4099eb0-0999-42a2-b525-5ae6b0ad984b nodeName:}" failed. 
No retries permitted until 2025-10-12 07:46:49.717462423 +0000 UTC m=+706.506657925 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f4099eb0-0999-42a2-b525-5ae6b0ad984b-cert") pod "openstack-operator-controller-manager-5b95c8954b-z5lsg" (UID: "f4099eb0-0999-42a2-b525-5ae6b0ad984b") : secret "webhook-server-cert" not found Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.246838 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrmss\" (UniqueName: \"kubernetes.io/projected/f4099eb0-0999-42a2-b525-5ae6b0ad984b-kube-api-access-wrmss\") pod \"openstack-operator-controller-manager-5b95c8954b-z5lsg\" (UID: \"f4099eb0-0999-42a2-b525-5ae6b0ad984b\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-z5lsg" Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.269074 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-6xt98" Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.319623 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tq6z\" (UniqueName: \"kubernetes.io/projected/d5252e14-f285-43af-ace5-375bcfbe4c68-kube-api-access-2tq6z\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-7sh6m\" (UID: \"d5252e14-f285-43af-ace5-375bcfbe4c68\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7sh6m" Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.319852 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cdab794-6175-4fb9-bd9d-c1080d45ee30-cert\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj\" (UID: \"0cdab794-6175-4fb9-bd9d-c1080d45ee30\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj" Oct 12 07:46:49 crc 
kubenswrapper[4599]: E1012 07:46:49.320165 4599 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 12 07:46:49 crc kubenswrapper[4599]: E1012 07:46:49.320250 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cdab794-6175-4fb9-bd9d-c1080d45ee30-cert podName:0cdab794-6175-4fb9-bd9d-c1080d45ee30 nodeName:}" failed. No retries permitted until 2025-10-12 07:46:50.320222956 +0000 UTC m=+707.109418459 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cdab794-6175-4fb9-bd9d-c1080d45ee30-cert") pod "openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj" (UID: "0cdab794-6175-4fb9-bd9d-c1080d45ee30") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.328443 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-hlvdz"] Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.371708 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tq6z\" (UniqueName: \"kubernetes.io/projected/d5252e14-f285-43af-ace5-375bcfbe4c68-kube-api-access-2tq6z\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-7sh6m\" (UID: \"d5252e14-f285-43af-ace5-375bcfbe4c68\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7sh6m" Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.395321 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7sh6m" Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.635272 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rd8wv" event={"ID":"37355d3f-b321-446a-b0ac-5d3a770bd0c5","Type":"ContainerStarted","Data":"fbf37ca123f72d13a4ec7ee1d72a12b0b1e09fac48e01633e3c9e38057e44d04"} Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.643604 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-hlvdz" event={"ID":"8ce364df-e28b-45cf-ae95-92ae415392f0","Type":"ContainerStarted","Data":"fb90ca31292b5f1a358d1fd967cac73d5505dcb5e7639089440794ecb2ab6134"} Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.727026 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4099eb0-0999-42a2-b525-5ae6b0ad984b-cert\") pod \"openstack-operator-controller-manager-5b95c8954b-z5lsg\" (UID: \"f4099eb0-0999-42a2-b525-5ae6b0ad984b\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-z5lsg" Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.733670 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4099eb0-0999-42a2-b525-5ae6b0ad984b-cert\") pod \"openstack-operator-controller-manager-5b95c8954b-z5lsg\" (UID: \"f4099eb0-0999-42a2-b525-5ae6b0ad984b\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-z5lsg" Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.752719 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-gq5k6"] Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.769519 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-sbgr8"] Oct 12 07:46:49 crc kubenswrapper[4599]: W1012 07:46:49.816358 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod986744d6_7f19_4f08_9dfd_03629fe2ca58.slice/crio-420b721bd127079d64a00bd3f5ae83aa5da045bd528ddab50c8595a828578371 WatchSource:0}: Error finding container 420b721bd127079d64a00bd3f5ae83aa5da045bd528ddab50c8595a828578371: Status 404 returned error can't find the container with id 420b721bd127079d64a00bd3f5ae83aa5da045bd528ddab50c8595a828578371 Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.816680 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-n6scs"] Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.823225 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-7zfpt"] Oct 12 07:46:49 crc kubenswrapper[4599]: I1012 07:46:49.878087 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-z5lsg" Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.025488 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-cz2h5"] Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.049221 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-kbqh6"] Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.060495 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-8ccz2"] Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.063819 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-5zhzq"] Oct 12 07:46:50 crc kubenswrapper[4599]: W1012 07:46:50.070986 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07c5394f_62de_4eca_86c0_c534788aead5.slice/crio-24f7fdc2d19271769dcaf87545f1b930e1b7d0c1a105e8480c1b1294ccf7e99b WatchSource:0}: Error finding container 24f7fdc2d19271769dcaf87545f1b930e1b7d0c1a105e8480c1b1294ccf7e99b: Status 404 returned error can't find the container with id 24f7fdc2d19271769dcaf87545f1b930e1b7d0c1a105e8480c1b1294ccf7e99b Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.078883 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-gzlp6"] Oct 12 07:46:50 crc kubenswrapper[4599]: W1012 07:46:50.081437 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod390a00af_1983_41ce_b7f2_3190e2d1594e.slice/crio-f95864dcd3d940f99ff7c50f47af1b4a63744c80117fb7536a91127b275a5be5 WatchSource:0}: Error finding container 
f95864dcd3d940f99ff7c50f47af1b4a63744c80117fb7536a91127b275a5be5: Status 404 returned error can't find the container with id f95864dcd3d940f99ff7c50f47af1b4a63744c80117fb7536a91127b275a5be5 Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.083297 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-65btt"] Oct 12 07:46:50 crc kubenswrapper[4599]: W1012 07:46:50.083599 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaf862ab_9fa9_4c44_8e6c_35599bcc45a1.slice/crio-daf6d0d7542e6dda779c113bf3468ba0cbed64a599e64ab4dfa1e2a7c79013ff WatchSource:0}: Error finding container daf6d0d7542e6dda779c113bf3468ba0cbed64a599e64ab4dfa1e2a7c79013ff: Status 404 returned error can't find the container with id daf6d0d7542e6dda779c113bf3468ba0cbed64a599e64ab4dfa1e2a7c79013ff Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.235554 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-d2q7d"] Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.241946 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79df5fb58c-2fhbx"] Oct 12 07:46:50 crc kubenswrapper[4599]: W1012 07:46:50.242232 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc9b0db0_82c1_492b_94de_c8f93e96364f.slice/crio-4b23c8dd774a8254e0c7506da15739ccb1a73b371a5efc6d26a0e28d2e0dbfe9 WatchSource:0}: Error finding container 4b23c8dd774a8254e0c7506da15739ccb1a73b371a5efc6d26a0e28d2e0dbfe9: Status 404 returned error can't find the container with id 4b23c8dd774a8254e0c7506da15739ccb1a73b371a5efc6d26a0e28d2e0dbfe9 Oct 12 07:46:50 crc kubenswrapper[4599]: W1012 07:46:50.243249 4599 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32a9cc54_3488_4659_a83f_0a6dc0c402c9.slice/crio-9f5be51dd896d7e70ee3f65ed4109468f9488b8118925203a1a0508e28728ec0 WatchSource:0}: Error finding container 9f5be51dd896d7e70ee3f65ed4109468f9488b8118925203a1a0508e28728ec0: Status 404 returned error can't find the container with id 9f5be51dd896d7e70ee3f65ed4109468f9488b8118925203a1a0508e28728ec0 Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.247224 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-6xt98"] Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.253563 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-67cfc6749b-m9l8f"] Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.261814 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5458f77c4-b6bwb"] Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.264581 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-nwqxv"] Oct 12 07:46:50 crc kubenswrapper[4599]: E1012 07:46:50.267357 4599 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cxdgb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-67cfc6749b-m9l8f_openstack-operators(5c565798-c0f8-4d14-b531-386b1b0efc63): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.267735 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-md9kr"] Oct 12 07:46:50 crc kubenswrapper[4599]: E1012 07:46:50.269400 4599 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hdlxz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-68b6c87b68-d2q7d_openstack-operators(b1de56e0-d6e0-4c5a-9c4e-c725f171e142): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 12 07:46:50 crc kubenswrapper[4599]: W1012 07:46:50.274744 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf13311d0_566a_4c8d_823c_fae47384cd53.slice/crio-d694117c58f4998ebe2fc434b83f7dddcb5c6ecc701a56adc611b5f4400645ec WatchSource:0}: Error finding container d694117c58f4998ebe2fc434b83f7dddcb5c6ecc701a56adc611b5f4400645ec: Status 404 returned error can't find the container with id d694117c58f4998ebe2fc434b83f7dddcb5c6ecc701a56adc611b5f4400645ec Oct 12 07:46:50 crc kubenswrapper[4599]: W1012 07:46:50.276116 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f9bd209_b904_4998_843c_d4573b0a2cd0.slice/crio-0f0942466d9b9678a2c2777fba3399b0a476b9d4c901c13581a847db764c81d8 WatchSource:0}: Error finding container 0f0942466d9b9678a2c2777fba3399b0a476b9d4c901c13581a847db764c81d8: Status 404 returned error can't find the container with id 
0f0942466d9b9678a2c2777fba3399b0a476b9d4c901c13581a847db764c81d8 Oct 12 07:46:50 crc kubenswrapper[4599]: E1012 07:46:50.284887 4599 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s98nr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5df598886f-md9kr_openstack-operators(84280387-ac26-4496-8c00-72673a91cb12): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 12 07:46:50 crc kubenswrapper[4599]: E1012 07:46:50.284980 4599 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:c487a793648e64af2d64df5f6efbda2d4fd586acd7aee6838d3ec2b3edd9efb9,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p2xst,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-7b7fb68549-nwqxv_openstack-operators(f13311d0-566a-4c8d-823c-fae47384cd53): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 12 07:46:50 crc kubenswrapper[4599]: E1012 07:46:50.285239 4599 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4v7dd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
test-operator-controller-manager-5458f77c4-b6bwb_openstack-operators(5f9bd209-b904-4998-843c-d4573b0a2cd0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.338325 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cdab794-6175-4fb9-bd9d-c1080d45ee30-cert\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj\" (UID: \"0cdab794-6175-4fb9-bd9d-c1080d45ee30\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj" Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.352781 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cdab794-6175-4fb9-bd9d-c1080d45ee30-cert\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj\" (UID: \"0cdab794-6175-4fb9-bd9d-c1080d45ee30\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj" Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.405919 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj" Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.416696 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7sh6m"] Oct 12 07:46:50 crc kubenswrapper[4599]: E1012 07:46:50.425081 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-m9l8f" podUID="5c565798-c0f8-4d14-b531-386b1b0efc63" Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.425312 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f554bff7b-ffpvz"] Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.447088 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b95c8954b-z5lsg"] Oct 12 07:46:50 crc kubenswrapper[4599]: E1012 07:46:50.464246 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-d2q7d" podUID="b1de56e0-d6e0-4c5a-9c4e-c725f171e142" Oct 12 07:46:50 crc kubenswrapper[4599]: E1012 07:46:50.467038 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-nwqxv" podUID="f13311d0-566a-4c8d-823c-fae47384cd53" Oct 12 07:46:50 crc kubenswrapper[4599]: E1012 07:46:50.467945 4599 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hwkjh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-7f554bff7b-ffpvz_openstack-operators(20be3b1c-5de2-4c22-a3af-215e2272d586): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 12 07:46:50 crc kubenswrapper[4599]: E1012 07:46:50.504403 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5458f77c4-b6bwb" podUID="5f9bd209-b904-4998-843c-d4573b0a2cd0" Oct 12 07:46:50 crc kubenswrapper[4599]: E1012 07:46:50.504501 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5df598886f-md9kr" podUID="84280387-ac26-4496-8c00-72673a91cb12" Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.681553 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2fhbx" event={"ID":"dc9b0db0-82c1-492b-94de-c8f93e96364f","Type":"ContainerStarted","Data":"4b23c8dd774a8254e0c7506da15739ccb1a73b371a5efc6d26a0e28d2e0dbfe9"} Oct 12 
07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.699610 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-gzlp6" event={"ID":"caf862ab-9fa9-4c44-8e6c-35599bcc45a1","Type":"ContainerStarted","Data":"daf6d0d7542e6dda779c113bf3468ba0cbed64a599e64ab4dfa1e2a7c79013ff"} Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.719457 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-kbqh6" event={"ID":"4f32c957-e414-456c-b06e-6f38553efe85","Type":"ContainerStarted","Data":"aabc74cc1acb3e427865085383752a47081f44003899c7ebaa9f835e25adbcd3"} Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.736467 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-65btt" event={"ID":"390a00af-1983-41ce-b7f2-3190e2d1594e","Type":"ContainerStarted","Data":"f95864dcd3d940f99ff7c50f47af1b4a63744c80117fb7536a91127b275a5be5"} Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.768595 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-d2q7d" event={"ID":"b1de56e0-d6e0-4c5a-9c4e-c725f171e142","Type":"ContainerStarted","Data":"59f06a3874e47e8d14904405c37636828c607ae64bfee4297716c6a6d138eed9"} Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.768649 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-d2q7d" event={"ID":"b1de56e0-d6e0-4c5a-9c4e-c725f171e142","Type":"ContainerStarted","Data":"23cd130abb090bbb493b1809333f707abb3174e421b710a4a1b42faa90b3cd45"} Oct 12 07:46:50 crc kubenswrapper[4599]: E1012 07:46:50.777494 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff\\\"\"" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-d2q7d" podUID="b1de56e0-d6e0-4c5a-9c4e-c725f171e142" Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.781823 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-8ccz2" event={"ID":"bdeb0fae-d9af-4253-a90c-8a50255cc6fe","Type":"ContainerStarted","Data":"8d2f2d4895aa27a7b7a25f55f82a96f8541bb91b46668c303788319ff01ac3fb"} Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.797548 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-m9l8f" event={"ID":"5c565798-c0f8-4d14-b531-386b1b0efc63","Type":"ContainerStarted","Data":"19ade6593f561a865e212d54a9d0fc8da2ba64b1e3fdcc247e21d3488b656976"} Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.797599 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-m9l8f" event={"ID":"5c565798-c0f8-4d14-b531-386b1b0efc63","Type":"ContainerStarted","Data":"b281bca84e02d72a3343fd8caab4f850c08245143168d83994e005d020bcbae0"} Oct 12 07:46:50 crc kubenswrapper[4599]: E1012 07:46:50.814316 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-m9l8f" podUID="5c565798-c0f8-4d14-b531-386b1b0efc63" Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.821427 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-cz2h5" 
event={"ID":"e88fb634-df40-47e9-a349-e7ac89e134f2","Type":"ContainerStarted","Data":"17abd9efe086503b5bfb624e0be37b27dc6fe59fe3c87036e2b3c05c20d820f4"} Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.829860 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj"] Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.832135 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-6xt98" event={"ID":"32a9cc54-3488-4659-a83f-0a6dc0c402c9","Type":"ContainerStarted","Data":"9f5be51dd896d7e70ee3f65ed4109468f9488b8118925203a1a0508e28728ec0"} Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.833601 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7sh6m" event={"ID":"d5252e14-f285-43af-ace5-375bcfbe4c68","Type":"ContainerStarted","Data":"dc54b00a72639d94636a817272c31d588a5e971c51f5ef01e43837a058cd520c"} Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.835412 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-n6scs" event={"ID":"986744d6-7f19-4f08-9dfd-03629fe2ca58","Type":"ContainerStarted","Data":"420b721bd127079d64a00bd3f5ae83aa5da045bd528ddab50c8595a828578371"} Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.838537 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-sbgr8" event={"ID":"6d340683-0013-4bf4-b98b-32610996ded4","Type":"ContainerStarted","Data":"3c72ddc8dc66c886b71cbf52660d9a6d28f05fb4dbd9488196f44eb03fc50a8b"} Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.843187 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-gq5k6" 
event={"ID":"f70b2a0c-df5a-4a41-89db-e1bf314ee45a","Type":"ContainerStarted","Data":"15830b2960073934c5b0d5969c3eff5318dd549c846fb35ed645f329bf471e26"} Oct 12 07:46:50 crc kubenswrapper[4599]: E1012 07:46:50.853597 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-ffpvz" podUID="20be3b1c-5de2-4c22-a3af-215e2272d586" Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.854690 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-nwqxv" event={"ID":"f13311d0-566a-4c8d-823c-fae47384cd53","Type":"ContainerStarted","Data":"da8e5d1541f05ba67e05bcb952499d88dcfaf5e6a1f69ff0b4802a66995d5758"} Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.854728 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-nwqxv" event={"ID":"f13311d0-566a-4c8d-823c-fae47384cd53","Type":"ContainerStarted","Data":"d694117c58f4998ebe2fc434b83f7dddcb5c6ecc701a56adc611b5f4400645ec"} Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.863158 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-ffpvz" event={"ID":"20be3b1c-5de2-4c22-a3af-215e2272d586","Type":"ContainerStarted","Data":"8e6cabb8797afd18561c0f04f23deafc8544674521b59d28e9644c6539aea4d2"} Oct 12 07:46:50 crc kubenswrapper[4599]: E1012 07:46:50.869946 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:c487a793648e64af2d64df5f6efbda2d4fd586acd7aee6838d3ec2b3edd9efb9\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-nwqxv" 
podUID="f13311d0-566a-4c8d-823c-fae47384cd53" Oct 12 07:46:50 crc kubenswrapper[4599]: E1012 07:46:50.870385 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-ffpvz" podUID="20be3b1c-5de2-4c22-a3af-215e2272d586" Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.870679 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-7zfpt" event={"ID":"297195f2-cb07-4bd8-9994-82f16e2f83f3","Type":"ContainerStarted","Data":"1f49a812001d5bf64a262f4143148964c6a982b2475383c293b35d278560d6bb"} Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.874506 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-z5lsg" event={"ID":"f4099eb0-0999-42a2-b525-5ae6b0ad984b","Type":"ContainerStarted","Data":"8bcb765c4662dae09586d74424b35b1b91fc8441630a2c1d0119d42347654628"} Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.880487 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5458f77c4-b6bwb" event={"ID":"5f9bd209-b904-4998-843c-d4573b0a2cd0","Type":"ContainerStarted","Data":"2f651e7ffa253957ff31b506c43b875fbb1d9f98ce27db06d11a3a126e01f814"} Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.880517 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5458f77c4-b6bwb" event={"ID":"5f9bd209-b904-4998-843c-d4573b0a2cd0","Type":"ContainerStarted","Data":"0f0942466d9b9678a2c2777fba3399b0a476b9d4c901c13581a847db764c81d8"} Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.883683 4599 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-5zhzq" event={"ID":"07c5394f-62de-4eca-86c0-c534788aead5","Type":"ContainerStarted","Data":"24f7fdc2d19271769dcaf87545f1b930e1b7d0c1a105e8480c1b1294ccf7e99b"} Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.887745 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-md9kr" event={"ID":"84280387-ac26-4496-8c00-72673a91cb12","Type":"ContainerStarted","Data":"d401ae76a809971cc9abad112cfe6c3bd2ce670b500dd5c0cee66d0af47cfedd"} Oct 12 07:46:50 crc kubenswrapper[4599]: I1012 07:46:50.887790 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-md9kr" event={"ID":"84280387-ac26-4496-8c00-72673a91cb12","Type":"ContainerStarted","Data":"fb44f07b0cedcfa3133ae43023bfe5962afbbc3cd8c2921453186ccb1b462d17"} Oct 12 07:46:50 crc kubenswrapper[4599]: E1012 07:46:50.893034 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-5458f77c4-b6bwb" podUID="5f9bd209-b904-4998-843c-d4573b0a2cd0" Oct 12 07:46:50 crc kubenswrapper[4599]: E1012 07:46:50.895299 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5df598886f-md9kr" podUID="84280387-ac26-4496-8c00-72673a91cb12" Oct 12 07:46:51 crc kubenswrapper[4599]: I1012 07:46:51.905360 4599 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-ffpvz" event={"ID":"20be3b1c-5de2-4c22-a3af-215e2272d586","Type":"ContainerStarted","Data":"7d9b5e1c8330c8a315227d57c53676a7d600cfb8165a864e48df601f210fa994"} Oct 12 07:46:51 crc kubenswrapper[4599]: E1012 07:46:51.907147 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-ffpvz" podUID="20be3b1c-5de2-4c22-a3af-215e2272d586" Oct 12 07:46:51 crc kubenswrapper[4599]: I1012 07:46:51.918505 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj" event={"ID":"0cdab794-6175-4fb9-bd9d-c1080d45ee30","Type":"ContainerStarted","Data":"5dc755dc41838e5323325ad52a5fa0c6c04b5b30d903e10ce06b64dd332d5e43"} Oct 12 07:46:51 crc kubenswrapper[4599]: I1012 07:46:51.929028 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-z5lsg" event={"ID":"f4099eb0-0999-42a2-b525-5ae6b0ad984b","Type":"ContainerStarted","Data":"af01ea0719aef3a4c7b09b28ee63fbe58b10362fe8149b85638b477816ac3e34"} Oct 12 07:46:51 crc kubenswrapper[4599]: I1012 07:46:51.929099 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-z5lsg" event={"ID":"f4099eb0-0999-42a2-b525-5ae6b0ad984b","Type":"ContainerStarted","Data":"1636bb8e432c37b4605b8ab243a6e6550c3a791be83dd362ab08833bd3607806"} Oct 12 07:46:51 crc kubenswrapper[4599]: E1012 07:46:51.931204 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff\\\"\"" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-d2q7d" podUID="b1de56e0-d6e0-4c5a-9c4e-c725f171e142" Oct 12 07:46:51 crc kubenswrapper[4599]: E1012 07:46:51.931658 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5df598886f-md9kr" podUID="84280387-ac26-4496-8c00-72673a91cb12" Oct 12 07:46:51 crc kubenswrapper[4599]: E1012 07:46:51.932260 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-5458f77c4-b6bwb" podUID="5f9bd209-b904-4998-843c-d4573b0a2cd0" Oct 12 07:46:51 crc kubenswrapper[4599]: E1012 07:46:51.932560 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-m9l8f" podUID="5c565798-c0f8-4d14-b531-386b1b0efc63" Oct 12 07:46:51 crc kubenswrapper[4599]: E1012 07:46:51.937150 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:c487a793648e64af2d64df5f6efbda2d4fd586acd7aee6838d3ec2b3edd9efb9\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-nwqxv" podUID="f13311d0-566a-4c8d-823c-fae47384cd53" Oct 12 07:46:51 crc kubenswrapper[4599]: I1012 07:46:51.954010 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-z5lsg" podStartSLOduration=3.953990108 podStartE2EDuration="3.953990108s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:46:51.951836734 +0000 UTC m=+708.741032236" watchObservedRunningTime="2025-10-12 07:46:51.953990108 +0000 UTC m=+708.743185609" Oct 12 07:46:52 crc kubenswrapper[4599]: I1012 07:46:52.935837 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-z5lsg" Oct 12 07:46:52 crc kubenswrapper[4599]: E1012 07:46:52.938005 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-ffpvz" podUID="20be3b1c-5de2-4c22-a3af-215e2272d586" Oct 12 07:46:58 crc kubenswrapper[4599]: I1012 07:46:58.321979 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:46:58 crc kubenswrapper[4599]: I1012 07:46:58.322312 4599 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:46:58 crc kubenswrapper[4599]: I1012 07:46:58.985880 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-cz2h5" event={"ID":"e88fb634-df40-47e9-a349-e7ac89e134f2","Type":"ContainerStarted","Data":"054bd6d7e134ad2bc14313fcd89fe8eacb71aa4bfbfb96c247c1164c37199520"} Oct 12 07:46:58 crc kubenswrapper[4599]: I1012 07:46:58.985935 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-cz2h5" event={"ID":"e88fb634-df40-47e9-a349-e7ac89e134f2","Type":"ContainerStarted","Data":"955c05ac5f664741b4dc7cea4c3403a397d7fc334bd98df197171b3598b8141d"} Oct 12 07:46:58 crc kubenswrapper[4599]: I1012 07:46:58.986983 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-cz2h5" Oct 12 07:46:58 crc kubenswrapper[4599]: I1012 07:46:58.998357 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-6xt98" event={"ID":"32a9cc54-3488-4659-a83f-0a6dc0c402c9","Type":"ContainerStarted","Data":"2bb9b0abf8c9b801fc49b030283b0b138084f532b327417365d9e1ff1e73aeda"} Oct 12 07:46:58 crc kubenswrapper[4599]: I1012 07:46:58.998397 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-6xt98" event={"ID":"32a9cc54-3488-4659-a83f-0a6dc0c402c9","Type":"ContainerStarted","Data":"3462f92376445d04fe342779945280b4653b64aca6d4dd2af5adfd8a86571369"} Oct 12 07:46:58 crc kubenswrapper[4599]: I1012 07:46:58.998816 4599 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-6xt98" Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.004009 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-5zhzq" event={"ID":"07c5394f-62de-4eca-86c0-c534788aead5","Type":"ContainerStarted","Data":"479ae93edf3778b246652827f5143991bd7953b563560d2eb6e7e44ffc3ce3e2"} Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.011138 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-sbgr8" event={"ID":"6d340683-0013-4bf4-b98b-32610996ded4","Type":"ContainerStarted","Data":"60c2d76448b8393415ea0a379124492439e25180651687558ce8405b210c5d96"} Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.012841 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-n6scs" event={"ID":"986744d6-7f19-4f08-9dfd-03629fe2ca58","Type":"ContainerStarted","Data":"fd720d85685412d1c8c12d111ee11ff75b1120a7a217c3e66bf91414868e9cab"} Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.019378 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-kbqh6" event={"ID":"4f32c957-e414-456c-b06e-6f38553efe85","Type":"ContainerStarted","Data":"bb403b81fc425ad7b37727aa776a247a506c554d3415803f4bc8e92c12aab45d"} Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.019432 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-kbqh6" event={"ID":"4f32c957-e414-456c-b06e-6f38553efe85","Type":"ContainerStarted","Data":"63725b96f2418d543b85b39aff0d46c66d6f348497edf07c81394606aedef3ae"} Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.020244 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/glance-operator-controller-manager-84b9b84486-kbqh6" Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.020555 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-cz2h5" podStartSLOduration=3.170750818 podStartE2EDuration="11.020539949s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" firstStartedPulling="2025-10-12 07:46:50.044975654 +0000 UTC m=+706.834171156" lastFinishedPulling="2025-10-12 07:46:57.894764785 +0000 UTC m=+714.683960287" observedRunningTime="2025-10-12 07:46:59.016998783 +0000 UTC m=+715.806194274" watchObservedRunningTime="2025-10-12 07:46:59.020539949 +0000 UTC m=+715.809735450" Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.024654 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rd8wv" event={"ID":"37355d3f-b321-446a-b0ac-5d3a770bd0c5","Type":"ContainerStarted","Data":"a67070aa490600ce72a3bb4fcadb5892a8d09b33b5859881f3493c041d5cbec5"} Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.028787 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-gq5k6" event={"ID":"f70b2a0c-df5a-4a41-89db-e1bf314ee45a","Type":"ContainerStarted","Data":"bfe6920a7d81e3b256c17f1796d27e83f0afffd981c54a9fca04ce9e8ce3a9af"} Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.030854 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj" event={"ID":"0cdab794-6175-4fb9-bd9d-c1080d45ee30","Type":"ContainerStarted","Data":"ef8d7d69e989937f495c7a1807c46a63a6ed6dd96c571d52c6361bbac5f767c2"} Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.035393 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2fhbx" 
event={"ID":"dc9b0db0-82c1-492b-94de-c8f93e96364f","Type":"ContainerStarted","Data":"eaabd3e4f6cc1594cc7a31bff0f30bd974ffa4ed6f97e4fa19f40d502c8f17be"} Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.039717 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-65btt" event={"ID":"390a00af-1983-41ce-b7f2-3190e2d1594e","Type":"ContainerStarted","Data":"13b9ea7bef0701120ddf98bdcb7778f8ca11f3ea9f38fabffe3126953471d796"} Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.049393 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-8ccz2" event={"ID":"bdeb0fae-d9af-4253-a90c-8a50255cc6fe","Type":"ContainerStarted","Data":"6e444b5b9e5831bc6d57898894afebcb80f3bf9378baec7a7096f597c9140ce1"} Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.049600 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-8ccz2" Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.054786 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-6xt98" podStartSLOduration=3.443489992 podStartE2EDuration="11.05477304s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" firstStartedPulling="2025-10-12 07:46:50.245544325 +0000 UTC m=+707.034739827" lastFinishedPulling="2025-10-12 07:46:57.856827373 +0000 UTC m=+714.646022875" observedRunningTime="2025-10-12 07:46:59.048901056 +0000 UTC m=+715.838096558" watchObservedRunningTime="2025-10-12 07:46:59.05477304 +0000 UTC m=+715.843968543" Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.060493 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7sh6m" 
event={"ID":"d5252e14-f285-43af-ace5-375bcfbe4c68","Type":"ContainerStarted","Data":"8f63b20f0550a16a34e9b95fd3e396feff8c07935c1b809b8e4fe7ac10b91808"} Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.063897 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-gzlp6" event={"ID":"caf862ab-9fa9-4c44-8e6c-35599bcc45a1","Type":"ContainerStarted","Data":"2dfca2bb5f188f6ea9551753de63eac42fe5532531846e61967059f6d6b2343f"} Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.063923 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-gzlp6" event={"ID":"caf862ab-9fa9-4c44-8e6c-35599bcc45a1","Type":"ContainerStarted","Data":"71781fe760682ec99902f8e972ac2c8575da0e9908dcde2e62bfb69aecabf1a2"} Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.064327 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-gzlp6" Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.068625 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-7zfpt" event={"ID":"297195f2-cb07-4bd8-9994-82f16e2f83f3","Type":"ContainerStarted","Data":"8d70ae43cb056dbd10fd2f3e3aab823dca96987f38869e5eb033caad8e9ce285"} Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.068647 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-7zfpt" event={"ID":"297195f2-cb07-4bd8-9994-82f16e2f83f3","Type":"ContainerStarted","Data":"daea9a3bd77e4c1f287aa9923afbb3a3f8668d34333da692aefa5fb2491eba4c"} Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.069006 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-7zfpt" Oct 12 07:46:59 crc 
kubenswrapper[4599]: I1012 07:46:59.075201 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-kbqh6" podStartSLOduration=3.268054101 podStartE2EDuration="11.075182473s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" firstStartedPulling="2025-10-12 07:46:50.066029395 +0000 UTC m=+706.855224898" lastFinishedPulling="2025-10-12 07:46:57.873157767 +0000 UTC m=+714.662353270" observedRunningTime="2025-10-12 07:46:59.070186874 +0000 UTC m=+715.859382376" watchObservedRunningTime="2025-10-12 07:46:59.075182473 +0000 UTC m=+715.864377976" Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.089759 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-hlvdz" event={"ID":"8ce364df-e28b-45cf-ae95-92ae415392f0","Type":"ContainerStarted","Data":"4ce6734010d541c0e84559eb6f8dc09e20349d67155e51c9c0e6e0d1e2244810"} Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.089905 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-hlvdz" event={"ID":"8ce364df-e28b-45cf-ae95-92ae415392f0","Type":"ContainerStarted","Data":"54fa1694bc7f9a7e09af0e49862af20fc7d112befbc594a35a05129c6eed9eb0"} Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.090744 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-hlvdz" Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.096681 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-gzlp6" podStartSLOduration=3.298842151 podStartE2EDuration="11.096661172s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" firstStartedPulling="2025-10-12 07:46:50.08701617 +0000 UTC m=+706.876211672" lastFinishedPulling="2025-10-12 
07:46:57.884835191 +0000 UTC m=+714.674030693" observedRunningTime="2025-10-12 07:46:59.093310191 +0000 UTC m=+715.882505694" watchObservedRunningTime="2025-10-12 07:46:59.096661172 +0000 UTC m=+715.885856674" Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.137481 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-7zfpt" podStartSLOduration=3.098066322 podStartE2EDuration="11.137462327s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" firstStartedPulling="2025-10-12 07:46:49.828091945 +0000 UTC m=+706.617287447" lastFinishedPulling="2025-10-12 07:46:57.86748795 +0000 UTC m=+714.656683452" observedRunningTime="2025-10-12 07:46:59.137202642 +0000 UTC m=+715.926398144" watchObservedRunningTime="2025-10-12 07:46:59.137462327 +0000 UTC m=+715.926657829" Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.158606 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-8ccz2" podStartSLOduration=3.357290431 podStartE2EDuration="11.158585111s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" firstStartedPulling="2025-10-12 07:46:50.066465438 +0000 UTC m=+706.855660940" lastFinishedPulling="2025-10-12 07:46:57.867760118 +0000 UTC m=+714.656955620" observedRunningTime="2025-10-12 07:46:59.155589352 +0000 UTC m=+715.944784854" watchObservedRunningTime="2025-10-12 07:46:59.158585111 +0000 UTC m=+715.947780612" Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.184673 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7sh6m" podStartSLOduration=2.740036847 podStartE2EDuration="10.184655054s" podCreationTimestamp="2025-10-12 07:46:49 +0000 UTC" firstStartedPulling="2025-10-12 07:46:50.463601499 +0000 UTC m=+707.252797000" lastFinishedPulling="2025-10-12 07:46:57.908219705 +0000 UTC 
m=+714.697415207" observedRunningTime="2025-10-12 07:46:59.180673386 +0000 UTC m=+715.969868889" watchObservedRunningTime="2025-10-12 07:46:59.184655054 +0000 UTC m=+715.973850556" Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.204998 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-hlvdz" podStartSLOduration=2.750868239 podStartE2EDuration="11.204978227s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" firstStartedPulling="2025-10-12 07:46:49.413393471 +0000 UTC m=+706.202588973" lastFinishedPulling="2025-10-12 07:46:57.867503459 +0000 UTC m=+714.656698961" observedRunningTime="2025-10-12 07:46:59.200817485 +0000 UTC m=+715.990012987" watchObservedRunningTime="2025-10-12 07:46:59.204978227 +0000 UTC m=+715.994173730" Oct 12 07:46:59 crc kubenswrapper[4599]: I1012 07:46:59.885455 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-z5lsg" Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.101795 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-gq5k6" event={"ID":"f70b2a0c-df5a-4a41-89db-e1bf314ee45a","Type":"ContainerStarted","Data":"5e71dd46fef5bcd9b636f4880139a1757c2702fffa7756632eb3ca32c4560c0c"} Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.102478 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-gq5k6" Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.105600 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-8ccz2" event={"ID":"bdeb0fae-d9af-4253-a90c-8a50255cc6fe","Type":"ContainerStarted","Data":"7ef429e6352d48fda35c61a82587713e2d3af86a0d76b71ee478c10f518b1fc1"} Oct 12 07:47:00 crc 
kubenswrapper[4599]: I1012 07:47:00.107472 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2fhbx" event={"ID":"dc9b0db0-82c1-492b-94de-c8f93e96364f","Type":"ContainerStarted","Data":"0e8c229768e391e4703fb1cbda98b3e58aac6e8feed22512fd22aa4b48240c17"} Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.107542 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2fhbx" Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.109593 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-5zhzq" event={"ID":"07c5394f-62de-4eca-86c0-c534788aead5","Type":"ContainerStarted","Data":"cb3ca596d1318b251de83f9bee511cab8917d419a706a1b04b76efd4136a42ba"} Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.109687 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-5zhzq" Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.111348 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-sbgr8" event={"ID":"6d340683-0013-4bf4-b98b-32610996ded4","Type":"ContainerStarted","Data":"aa292fe2e43d31166b7c3c274a24be87f57041bb554bb2ba6e137580602fc9b4"} Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.111605 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-sbgr8" Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.115110 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-n6scs" event={"ID":"986744d6-7f19-4f08-9dfd-03629fe2ca58","Type":"ContainerStarted","Data":"b9412fd7f81a372db09d00338ce95691d2d24a56f6933a7363fc20347d0522f8"} Oct 
12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.115538 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-n6scs" Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.116043 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-gq5k6" podStartSLOduration=4.011104439 podStartE2EDuration="12.116019216s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" firstStartedPulling="2025-10-12 07:46:49.780214012 +0000 UTC m=+706.569409514" lastFinishedPulling="2025-10-12 07:46:57.885128788 +0000 UTC m=+714.674324291" observedRunningTime="2025-10-12 07:47:00.113957891 +0000 UTC m=+716.903153393" watchObservedRunningTime="2025-10-12 07:47:00.116019216 +0000 UTC m=+716.905214719" Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.117789 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rd8wv" event={"ID":"37355d3f-b321-446a-b0ac-5d3a770bd0c5","Type":"ContainerStarted","Data":"f50eef760debc9c38c0cd2a71b972f2066b28edeb26b87e231d6a92c2a7df103"} Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.117920 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rd8wv" Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.120130 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-65btt" event={"ID":"390a00af-1983-41ce-b7f2-3190e2d1594e","Type":"ContainerStarted","Data":"c384c017c6eba2d71e6c5f09667e3c6b0aa08eba901b3eecc599c89efbfa6305"} Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.120269 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-65btt" Oct 12 
07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.122202 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj" event={"ID":"0cdab794-6175-4fb9-bd9d-c1080d45ee30","Type":"ContainerStarted","Data":"ed87bf31aa29c91fbb98dab4d80e396a02fa9348fe262d6e0a8185a633fc0d66"} Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.135099 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-sbgr8" podStartSLOduration=4.096319574 podStartE2EDuration="12.135071418s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" firstStartedPulling="2025-10-12 07:46:49.800365771 +0000 UTC m=+706.589561273" lastFinishedPulling="2025-10-12 07:46:57.839117615 +0000 UTC m=+714.628313117" observedRunningTime="2025-10-12 07:47:00.126631744 +0000 UTC m=+716.915827246" watchObservedRunningTime="2025-10-12 07:47:00.135071418 +0000 UTC m=+716.924266919" Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.143015 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2fhbx" podStartSLOduration=4.478667083 podStartE2EDuration="12.142994258s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" firstStartedPulling="2025-10-12 07:46:50.243980534 +0000 UTC m=+707.033176036" lastFinishedPulling="2025-10-12 07:46:57.908307709 +0000 UTC m=+714.697503211" observedRunningTime="2025-10-12 07:47:00.140048171 +0000 UTC m=+716.929243673" watchObservedRunningTime="2025-10-12 07:47:00.142994258 +0000 UTC m=+716.932189750" Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.155242 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-5zhzq" podStartSLOduration=4.356624698 podStartE2EDuration="12.155215215s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" 
firstStartedPulling="2025-10-12 07:46:50.075114991 +0000 UTC m=+706.864310494" lastFinishedPulling="2025-10-12 07:46:57.873705508 +0000 UTC m=+714.662901011" observedRunningTime="2025-10-12 07:47:00.151285695 +0000 UTC m=+716.940481197" watchObservedRunningTime="2025-10-12 07:47:00.155215215 +0000 UTC m=+716.944410718" Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.166662 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rd8wv" podStartSLOduration=3.5697235430000003 podStartE2EDuration="12.166641732s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" firstStartedPulling="2025-10-12 07:46:49.275954185 +0000 UTC m=+706.065149687" lastFinishedPulling="2025-10-12 07:46:57.872872374 +0000 UTC m=+714.662067876" observedRunningTime="2025-10-12 07:47:00.161433746 +0000 UTC m=+716.950629248" watchObservedRunningTime="2025-10-12 07:47:00.166641732 +0000 UTC m=+716.955837234" Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.176470 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-n6scs" podStartSLOduration=4.133425411 podStartE2EDuration="12.176450741s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" firstStartedPulling="2025-10-12 07:46:49.820069765 +0000 UTC m=+706.609265268" lastFinishedPulling="2025-10-12 07:46:57.863095095 +0000 UTC m=+714.652290598" observedRunningTime="2025-10-12 07:47:00.1715986 +0000 UTC m=+716.960794112" watchObservedRunningTime="2025-10-12 07:47:00.176450741 +0000 UTC m=+716.965646243" Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.191174 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj" podStartSLOduration=5.164517952 podStartE2EDuration="12.1911595s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" 
firstStartedPulling="2025-10-12 07:46:50.852884158 +0000 UTC m=+707.642079660" lastFinishedPulling="2025-10-12 07:46:57.879525706 +0000 UTC m=+714.668721208" observedRunningTime="2025-10-12 07:47:00.189231413 +0000 UTC m=+716.978426925" watchObservedRunningTime="2025-10-12 07:47:00.1911595 +0000 UTC m=+716.980355002" Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.209025 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-65btt" podStartSLOduration=4.407004713 podStartE2EDuration="12.209003187s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" firstStartedPulling="2025-10-12 07:46:50.083065965 +0000 UTC m=+706.872261467" lastFinishedPulling="2025-10-12 07:46:57.885064438 +0000 UTC m=+714.674259941" observedRunningTime="2025-10-12 07:47:00.205555736 +0000 UTC m=+716.994751238" watchObservedRunningTime="2025-10-12 07:47:00.209003187 +0000 UTC m=+716.998198689" Oct 12 07:47:00 crc kubenswrapper[4599]: I1012 07:47:00.406650 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj" Oct 12 07:47:06 crc kubenswrapper[4599]: I1012 07:47:06.173620 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-d2q7d" event={"ID":"b1de56e0-d6e0-4c5a-9c4e-c725f171e142","Type":"ContainerStarted","Data":"124944bec672bb73eb375b38fea2e49563bcc3994945330d7cbeacc32e525be7"} Oct 12 07:47:06 crc kubenswrapper[4599]: I1012 07:47:06.174423 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-d2q7d" Oct 12 07:47:06 crc kubenswrapper[4599]: I1012 07:47:06.193890 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-d2q7d" 
podStartSLOduration=3.316044437 podStartE2EDuration="18.193876388s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" firstStartedPulling="2025-10-12 07:46:50.26927079 +0000 UTC m=+707.058466292" lastFinishedPulling="2025-10-12 07:47:05.147102741 +0000 UTC m=+721.936298243" observedRunningTime="2025-10-12 07:47:06.189927752 +0000 UTC m=+722.979123254" watchObservedRunningTime="2025-10-12 07:47:06.193876388 +0000 UTC m=+722.983071890" Oct 12 07:47:07 crc kubenswrapper[4599]: I1012 07:47:07.194635 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-md9kr" event={"ID":"84280387-ac26-4496-8c00-72673a91cb12","Type":"ContainerStarted","Data":"9aa24ff71b131bab0967032f3c4e912d19bedf46f6cbec84ca0c965779b17051"} Oct 12 07:47:07 crc kubenswrapper[4599]: I1012 07:47:07.194952 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5df598886f-md9kr" Oct 12 07:47:07 crc kubenswrapper[4599]: I1012 07:47:07.198088 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-m9l8f" event={"ID":"5c565798-c0f8-4d14-b531-386b1b0efc63","Type":"ContainerStarted","Data":"12b1e6165f0d8d26e5abac798f13f9ad4d97f44a069cdaa6beac04c61a7e32e7"} Oct 12 07:47:07 crc kubenswrapper[4599]: I1012 07:47:07.198383 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-m9l8f" Oct 12 07:47:07 crc kubenswrapper[4599]: I1012 07:47:07.215734 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5df598886f-md9kr" podStartSLOduration=2.692183254 podStartE2EDuration="19.215711613s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" firstStartedPulling="2025-10-12 07:46:50.284753688 +0000 UTC m=+707.073949190" 
lastFinishedPulling="2025-10-12 07:47:06.808282047 +0000 UTC m=+723.597477549" observedRunningTime="2025-10-12 07:47:07.212905338 +0000 UTC m=+724.002100839" watchObservedRunningTime="2025-10-12 07:47:07.215711613 +0000 UTC m=+724.004907114" Oct 12 07:47:07 crc kubenswrapper[4599]: I1012 07:47:07.229931 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-m9l8f" podStartSLOduration=2.690375922 podStartE2EDuration="19.229909128s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" firstStartedPulling="2025-10-12 07:46:50.26717787 +0000 UTC m=+707.056373372" lastFinishedPulling="2025-10-12 07:47:06.806711076 +0000 UTC m=+723.595906578" observedRunningTime="2025-10-12 07:47:07.227140233 +0000 UTC m=+724.016335735" watchObservedRunningTime="2025-10-12 07:47:07.229909128 +0000 UTC m=+724.019104631" Oct 12 07:47:08 crc kubenswrapper[4599]: I1012 07:47:08.205077 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5458f77c4-b6bwb" event={"ID":"5f9bd209-b904-4998-843c-d4573b0a2cd0","Type":"ContainerStarted","Data":"5ea85661264d1a83a9d291e44768a576be783988bbfaeed1a747b978e5563c8f"} Oct 12 07:47:08 crc kubenswrapper[4599]: I1012 07:47:08.223506 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5458f77c4-b6bwb" podStartSLOduration=3.143458994 podStartE2EDuration="20.223486598s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" firstStartedPulling="2025-10-12 07:46:50.285156368 +0000 UTC m=+707.074351870" lastFinishedPulling="2025-10-12 07:47:07.365183973 +0000 UTC m=+724.154379474" observedRunningTime="2025-10-12 07:47:08.219061969 +0000 UTC m=+725.008257472" watchObservedRunningTime="2025-10-12 07:47:08.223486598 +0000 UTC m=+725.012682100" Oct 12 07:47:08 crc kubenswrapper[4599]: I1012 07:47:08.537845 4599 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-rd8wv" Oct 12 07:47:08 crc kubenswrapper[4599]: I1012 07:47:08.628083 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-n6scs" Oct 12 07:47:08 crc kubenswrapper[4599]: I1012 07:47:08.656906 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-hlvdz" Oct 12 07:47:08 crc kubenswrapper[4599]: I1012 07:47:08.694276 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-7zfpt" Oct 12 07:47:08 crc kubenswrapper[4599]: I1012 07:47:08.705103 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-gq5k6" Oct 12 07:47:08 crc kubenswrapper[4599]: I1012 07:47:08.729553 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-sbgr8" Oct 12 07:47:08 crc kubenswrapper[4599]: I1012 07:47:08.744712 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-5zhzq" Oct 12 07:47:08 crc kubenswrapper[4599]: I1012 07:47:08.757655 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-cz2h5" Oct 12 07:47:08 crc kubenswrapper[4599]: I1012 07:47:08.796845 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-65btt" Oct 12 07:47:08 crc kubenswrapper[4599]: I1012 07:47:08.844496 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-8ccz2" Oct 12 07:47:08 crc kubenswrapper[4599]: I1012 07:47:08.858506 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-kbqh6" Oct 12 07:47:08 crc kubenswrapper[4599]: I1012 07:47:08.927733 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2fhbx" Oct 12 07:47:08 crc kubenswrapper[4599]: I1012 07:47:08.942540 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-gzlp6" Oct 12 07:47:09 crc kubenswrapper[4599]: I1012 07:47:09.017535 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5458f77c4-b6bwb" Oct 12 07:47:09 crc kubenswrapper[4599]: I1012 07:47:09.216102 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-nwqxv" event={"ID":"f13311d0-566a-4c8d-823c-fae47384cd53","Type":"ContainerStarted","Data":"33439c270b578b0634b2258a354f4474da3dd14246e44c8cc81d7611fdcd130e"} Oct 12 07:47:09 crc kubenswrapper[4599]: I1012 07:47:09.216513 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-nwqxv" Oct 12 07:47:09 crc kubenswrapper[4599]: I1012 07:47:09.219376 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-ffpvz" event={"ID":"20be3b1c-5de2-4c22-a3af-215e2272d586","Type":"ContainerStarted","Data":"a2ab67cc152ade12a5d5022df745180f174636514384bf6eb6be44457810c48b"} Oct 12 07:47:09 crc kubenswrapper[4599]: I1012 07:47:09.219736 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-ffpvz" Oct 12 07:47:09 crc kubenswrapper[4599]: I1012 07:47:09.233477 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-nwqxv" podStartSLOduration=3.330551545 podStartE2EDuration="21.233456494s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" firstStartedPulling="2025-10-12 07:46:50.284816988 +0000 UTC m=+707.074012489" lastFinishedPulling="2025-10-12 07:47:08.187721936 +0000 UTC m=+724.976917438" observedRunningTime="2025-10-12 07:47:09.23262179 +0000 UTC m=+726.021817302" watchObservedRunningTime="2025-10-12 07:47:09.233456494 +0000 UTC m=+726.022651996" Oct 12 07:47:09 crc kubenswrapper[4599]: I1012 07:47:09.252774 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-ffpvz" podStartSLOduration=3.531496262 podStartE2EDuration="21.252758044s" podCreationTimestamp="2025-10-12 07:46:48 +0000 UTC" firstStartedPulling="2025-10-12 07:46:50.467747863 +0000 UTC m=+707.256943365" lastFinishedPulling="2025-10-12 07:47:08.189009645 +0000 UTC m=+724.978205147" observedRunningTime="2025-10-12 07:47:09.246051219 +0000 UTC m=+726.035246722" watchObservedRunningTime="2025-10-12 07:47:09.252758044 +0000 UTC m=+726.041953546" Oct 12 07:47:09 crc kubenswrapper[4599]: I1012 07:47:09.276554 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-6xt98" Oct 12 07:47:10 crc kubenswrapper[4599]: I1012 07:47:10.413253 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj" Oct 12 07:47:18 crc kubenswrapper[4599]: I1012 07:47:18.819589 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/nova-operator-controller-manager-5df598886f-md9kr" Oct 12 07:47:18 crc kubenswrapper[4599]: I1012 07:47:18.820485 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-nwqxv" Oct 12 07:47:18 crc kubenswrapper[4599]: I1012 07:47:18.931846 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-d2q7d" Oct 12 07:47:18 crc kubenswrapper[4599]: I1012 07:47:18.984958 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-m9l8f" Oct 12 07:47:19 crc kubenswrapper[4599]: I1012 07:47:19.022101 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5458f77c4-b6bwb" Oct 12 07:47:19 crc kubenswrapper[4599]: I1012 07:47:19.129912 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-ffpvz" Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.136208 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4twv2"] Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.136792 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" podUID="2b0cadaa-2cc9-45a1-add5-6d1b13c114f8" containerName="controller-manager" containerID="cri-o://e793344546dda56bb8c93f53481ee5ab7982deabefe22e9ee1934a7126048cce" gracePeriod=30 Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.236365 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw"] Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.236596 
4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" podUID="69d5f53c-375a-424e-a120-93d89d06ae50" containerName="route-controller-manager" containerID="cri-o://1e2e2fbce08b456e2555df4f72f7e3bacfb9ae9768ff4a4d00995b004ec99e1f" gracePeriod=30 Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.295311 4599 generic.go:334] "Generic (PLEG): container finished" podID="2b0cadaa-2cc9-45a1-add5-6d1b13c114f8" containerID="e793344546dda56bb8c93f53481ee5ab7982deabefe22e9ee1934a7126048cce" exitCode=0 Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.295378 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" event={"ID":"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8","Type":"ContainerDied","Data":"e793344546dda56bb8c93f53481ee5ab7982deabefe22e9ee1934a7126048cce"} Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.530020 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.626498 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f6vn\" (UniqueName: \"kubernetes.io/projected/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-kube-api-access-8f6vn\") pod \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\" (UID: \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\") " Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.626582 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-config\") pod \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\" (UID: \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\") " Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.626626 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-client-ca\") pod \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\" (UID: \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\") " Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.626687 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-proxy-ca-bundles\") pod \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\" (UID: \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\") " Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.626726 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-serving-cert\") pod \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\" (UID: \"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8\") " Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.627291 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-client-ca" (OuterVolumeSpecName: "client-ca") pod "2b0cadaa-2cc9-45a1-add5-6d1b13c114f8" (UID: "2b0cadaa-2cc9-45a1-add5-6d1b13c114f8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.627362 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-config" (OuterVolumeSpecName: "config") pod "2b0cadaa-2cc9-45a1-add5-6d1b13c114f8" (UID: "2b0cadaa-2cc9-45a1-add5-6d1b13c114f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.627505 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2b0cadaa-2cc9-45a1-add5-6d1b13c114f8" (UID: "2b0cadaa-2cc9-45a1-add5-6d1b13c114f8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.635010 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2b0cadaa-2cc9-45a1-add5-6d1b13c114f8" (UID: "2b0cadaa-2cc9-45a1-add5-6d1b13c114f8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.635199 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-kube-api-access-8f6vn" (OuterVolumeSpecName: "kube-api-access-8f6vn") pod "2b0cadaa-2cc9-45a1-add5-6d1b13c114f8" (UID: "2b0cadaa-2cc9-45a1-add5-6d1b13c114f8"). InnerVolumeSpecName "kube-api-access-8f6vn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.728011 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.728051 4599 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-client-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.728061 4599 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.728101 4599 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:47:20 crc kubenswrapper[4599]: I1012 07:47:20.728110 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f6vn\" (UniqueName: \"kubernetes.io/projected/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8-kube-api-access-8f6vn\") on node \"crc\" DevicePath \"\"" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.292585 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-854d7dcb6-svggh"] Oct 12 07:47:21 crc kubenswrapper[4599]: E1012 07:47:21.292911 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b0cadaa-2cc9-45a1-add5-6d1b13c114f8" containerName="controller-manager" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.292926 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b0cadaa-2cc9-45a1-add5-6d1b13c114f8" containerName="controller-manager" Oct 12 07:47:21 crc 
kubenswrapper[4599]: I1012 07:47:21.293115 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b0cadaa-2cc9-45a1-add5-6d1b13c114f8" containerName="controller-manager" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.293607 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-854d7dcb6-svggh" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.301938 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-854d7dcb6-svggh"] Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.305624 4599 generic.go:334] "Generic (PLEG): container finished" podID="69d5f53c-375a-424e-a120-93d89d06ae50" containerID="1e2e2fbce08b456e2555df4f72f7e3bacfb9ae9768ff4a4d00995b004ec99e1f" exitCode=0 Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.305698 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" event={"ID":"69d5f53c-375a-424e-a120-93d89d06ae50","Type":"ContainerDied","Data":"1e2e2fbce08b456e2555df4f72f7e3bacfb9ae9768ff4a4d00995b004ec99e1f"} Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.307358 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" event={"ID":"2b0cadaa-2cc9-45a1-add5-6d1b13c114f8","Type":"ContainerDied","Data":"84287a8208edc4d28dfe446e6f2535c88edb8cf46e7c1186260e18c960562c90"} Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.307399 4599 scope.go:117] "RemoveContainer" containerID="e793344546dda56bb8c93f53481ee5ab7982deabefe22e9ee1934a7126048cce" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.307452 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4twv2" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.338383 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4twv2"] Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.344893 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4twv2"] Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.438430 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c2f2023-e542-49b6-99f5-1051ee9b8e83-serving-cert\") pod \"controller-manager-854d7dcb6-svggh\" (UID: \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\") " pod="openshift-controller-manager/controller-manager-854d7dcb6-svggh" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.438520 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tr97\" (UniqueName: \"kubernetes.io/projected/3c2f2023-e542-49b6-99f5-1051ee9b8e83-kube-api-access-6tr97\") pod \"controller-manager-854d7dcb6-svggh\" (UID: \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\") " pod="openshift-controller-manager/controller-manager-854d7dcb6-svggh" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.438568 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c2f2023-e542-49b6-99f5-1051ee9b8e83-client-ca\") pod \"controller-manager-854d7dcb6-svggh\" (UID: \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\") " pod="openshift-controller-manager/controller-manager-854d7dcb6-svggh" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.438600 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3c2f2023-e542-49b6-99f5-1051ee9b8e83-config\") pod \"controller-manager-854d7dcb6-svggh\" (UID: \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\") " pod="openshift-controller-manager/controller-manager-854d7dcb6-svggh" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.438827 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c2f2023-e542-49b6-99f5-1051ee9b8e83-proxy-ca-bundles\") pod \"controller-manager-854d7dcb6-svggh\" (UID: \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\") " pod="openshift-controller-manager/controller-manager-854d7dcb6-svggh" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.524248 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-854d7dcb6-svggh"] Oct 12 07:47:21 crc kubenswrapper[4599]: E1012 07:47:21.524767 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-6tr97 proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-854d7dcb6-svggh" podUID="3c2f2023-e542-49b6-99f5-1051ee9b8e83" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.540440 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c2f2023-e542-49b6-99f5-1051ee9b8e83-serving-cert\") pod \"controller-manager-854d7dcb6-svggh\" (UID: \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\") " pod="openshift-controller-manager/controller-manager-854d7dcb6-svggh" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.540496 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tr97\" (UniqueName: \"kubernetes.io/projected/3c2f2023-e542-49b6-99f5-1051ee9b8e83-kube-api-access-6tr97\") pod \"controller-manager-854d7dcb6-svggh\" 
(UID: \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\") " pod="openshift-controller-manager/controller-manager-854d7dcb6-svggh" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.540530 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c2f2023-e542-49b6-99f5-1051ee9b8e83-client-ca\") pod \"controller-manager-854d7dcb6-svggh\" (UID: \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\") " pod="openshift-controller-manager/controller-manager-854d7dcb6-svggh" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.540551 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c2f2023-e542-49b6-99f5-1051ee9b8e83-config\") pod \"controller-manager-854d7dcb6-svggh\" (UID: \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\") " pod="openshift-controller-manager/controller-manager-854d7dcb6-svggh" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.540614 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c2f2023-e542-49b6-99f5-1051ee9b8e83-proxy-ca-bundles\") pod \"controller-manager-854d7dcb6-svggh\" (UID: \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\") " pod="openshift-controller-manager/controller-manager-854d7dcb6-svggh" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.541582 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c2f2023-e542-49b6-99f5-1051ee9b8e83-proxy-ca-bundles\") pod \"controller-manager-854d7dcb6-svggh\" (UID: \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\") " pod="openshift-controller-manager/controller-manager-854d7dcb6-svggh" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.543315 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/3c2f2023-e542-49b6-99f5-1051ee9b8e83-client-ca\") pod \"controller-manager-854d7dcb6-svggh\" (UID: \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\") " pod="openshift-controller-manager/controller-manager-854d7dcb6-svggh" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.543842 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c2f2023-e542-49b6-99f5-1051ee9b8e83-config\") pod \"controller-manager-854d7dcb6-svggh\" (UID: \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\") " pod="openshift-controller-manager/controller-manager-854d7dcb6-svggh" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.546702 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c2f2023-e542-49b6-99f5-1051ee9b8e83-serving-cert\") pod \"controller-manager-854d7dcb6-svggh\" (UID: \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\") " pod="openshift-controller-manager/controller-manager-854d7dcb6-svggh" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.563829 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tr97\" (UniqueName: \"kubernetes.io/projected/3c2f2023-e542-49b6-99f5-1051ee9b8e83-kube-api-access-6tr97\") pod \"controller-manager-854d7dcb6-svggh\" (UID: \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\") " pod="openshift-controller-manager/controller-manager-854d7dcb6-svggh" Oct 12 07:47:21 crc kubenswrapper[4599]: I1012 07:47:21.566045 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b0cadaa-2cc9-45a1-add5-6d1b13c114f8" path="/var/lib/kubelet/pods/2b0cadaa-2cc9-45a1-add5-6d1b13c114f8/volumes" Oct 12 07:47:22 crc kubenswrapper[4599]: I1012 07:47:22.316733 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-854d7dcb6-svggh" Oct 12 07:47:22 crc kubenswrapper[4599]: I1012 07:47:22.326835 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-854d7dcb6-svggh" Oct 12 07:47:22 crc kubenswrapper[4599]: I1012 07:47:22.456526 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c2f2023-e542-49b6-99f5-1051ee9b8e83-serving-cert\") pod \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\" (UID: \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\") " Oct 12 07:47:22 crc kubenswrapper[4599]: I1012 07:47:22.456608 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c2f2023-e542-49b6-99f5-1051ee9b8e83-client-ca\") pod \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\" (UID: \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\") " Oct 12 07:47:22 crc kubenswrapper[4599]: I1012 07:47:22.456709 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c2f2023-e542-49b6-99f5-1051ee9b8e83-proxy-ca-bundles\") pod \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\" (UID: \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\") " Oct 12 07:47:22 crc kubenswrapper[4599]: I1012 07:47:22.456734 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tr97\" (UniqueName: \"kubernetes.io/projected/3c2f2023-e542-49b6-99f5-1051ee9b8e83-kube-api-access-6tr97\") pod \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\" (UID: \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\") " Oct 12 07:47:22 crc kubenswrapper[4599]: I1012 07:47:22.456754 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c2f2023-e542-49b6-99f5-1051ee9b8e83-config\") pod 
\"3c2f2023-e542-49b6-99f5-1051ee9b8e83\" (UID: \"3c2f2023-e542-49b6-99f5-1051ee9b8e83\") " Oct 12 07:47:22 crc kubenswrapper[4599]: I1012 07:47:22.457441 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c2f2023-e542-49b6-99f5-1051ee9b8e83-client-ca" (OuterVolumeSpecName: "client-ca") pod "3c2f2023-e542-49b6-99f5-1051ee9b8e83" (UID: "3c2f2023-e542-49b6-99f5-1051ee9b8e83"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:47:22 crc kubenswrapper[4599]: I1012 07:47:22.457714 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c2f2023-e542-49b6-99f5-1051ee9b8e83-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3c2f2023-e542-49b6-99f5-1051ee9b8e83" (UID: "3c2f2023-e542-49b6-99f5-1051ee9b8e83"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:47:22 crc kubenswrapper[4599]: I1012 07:47:22.457773 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c2f2023-e542-49b6-99f5-1051ee9b8e83-config" (OuterVolumeSpecName: "config") pod "3c2f2023-e542-49b6-99f5-1051ee9b8e83" (UID: "3c2f2023-e542-49b6-99f5-1051ee9b8e83"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:47:22 crc kubenswrapper[4599]: I1012 07:47:22.458402 4599 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c2f2023-e542-49b6-99f5-1051ee9b8e83-client-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:47:22 crc kubenswrapper[4599]: I1012 07:47:22.458427 4599 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c2f2023-e542-49b6-99f5-1051ee9b8e83-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 12 07:47:22 crc kubenswrapper[4599]: I1012 07:47:22.458441 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c2f2023-e542-49b6-99f5-1051ee9b8e83-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:47:22 crc kubenswrapper[4599]: I1012 07:47:22.461560 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c2f2023-e542-49b6-99f5-1051ee9b8e83-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3c2f2023-e542-49b6-99f5-1051ee9b8e83" (UID: "3c2f2023-e542-49b6-99f5-1051ee9b8e83"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:47:22 crc kubenswrapper[4599]: I1012 07:47:22.461690 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c2f2023-e542-49b6-99f5-1051ee9b8e83-kube-api-access-6tr97" (OuterVolumeSpecName: "kube-api-access-6tr97") pod "3c2f2023-e542-49b6-99f5-1051ee9b8e83" (UID: "3c2f2023-e542-49b6-99f5-1051ee9b8e83"). InnerVolumeSpecName "kube-api-access-6tr97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:47:22 crc kubenswrapper[4599]: I1012 07:47:22.560527 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tr97\" (UniqueName: \"kubernetes.io/projected/3c2f2023-e542-49b6-99f5-1051ee9b8e83-kube-api-access-6tr97\") on node \"crc\" DevicePath \"\"" Oct 12 07:47:22 crc kubenswrapper[4599]: I1012 07:47:22.560697 4599 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c2f2023-e542-49b6-99f5-1051ee9b8e83-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.324148 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-854d7dcb6-svggh" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.359096 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-854d7dcb6-svggh"] Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.362151 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-854d7dcb6-svggh"] Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.369265 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d8454f85d-4km5d"] Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.370314 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.372963 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c27c9a1-bf33-4325-8939-4609b3c786b6-client-ca\") pod \"controller-manager-5d8454f85d-4km5d\" (UID: \"5c27c9a1-bf33-4325-8939-4609b3c786b6\") " pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.373016 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c27c9a1-bf33-4325-8939-4609b3c786b6-serving-cert\") pod \"controller-manager-5d8454f85d-4km5d\" (UID: \"5c27c9a1-bf33-4325-8939-4609b3c786b6\") " pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.373057 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5c27c9a1-bf33-4325-8939-4609b3c786b6-proxy-ca-bundles\") pod \"controller-manager-5d8454f85d-4km5d\" (UID: \"5c27c9a1-bf33-4325-8939-4609b3c786b6\") " pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.373077 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fzh8\" (UniqueName: \"kubernetes.io/projected/5c27c9a1-bf33-4325-8939-4609b3c786b6-kube-api-access-4fzh8\") pod \"controller-manager-5d8454f85d-4km5d\" (UID: \"5c27c9a1-bf33-4325-8939-4609b3c786b6\") " pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.373110 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c27c9a1-bf33-4325-8939-4609b3c786b6-config\") pod \"controller-manager-5d8454f85d-4km5d\" (UID: \"5c27c9a1-bf33-4325-8939-4609b3c786b6\") " pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.376644 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.376878 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.376898 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.376967 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.377023 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.377208 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.380732 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.382373 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d8454f85d-4km5d"] Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.474237 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5c27c9a1-bf33-4325-8939-4609b3c786b6-client-ca\") pod \"controller-manager-5d8454f85d-4km5d\" (UID: \"5c27c9a1-bf33-4325-8939-4609b3c786b6\") " pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.474367 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c27c9a1-bf33-4325-8939-4609b3c786b6-serving-cert\") pod \"controller-manager-5d8454f85d-4km5d\" (UID: \"5c27c9a1-bf33-4325-8939-4609b3c786b6\") " pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.474582 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fzh8\" (UniqueName: \"kubernetes.io/projected/5c27c9a1-bf33-4325-8939-4609b3c786b6-kube-api-access-4fzh8\") pod \"controller-manager-5d8454f85d-4km5d\" (UID: \"5c27c9a1-bf33-4325-8939-4609b3c786b6\") " pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.475148 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5c27c9a1-bf33-4325-8939-4609b3c786b6-proxy-ca-bundles\") pod \"controller-manager-5d8454f85d-4km5d\" (UID: \"5c27c9a1-bf33-4325-8939-4609b3c786b6\") " pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.475194 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c27c9a1-bf33-4325-8939-4609b3c786b6-config\") pod \"controller-manager-5d8454f85d-4km5d\" (UID: \"5c27c9a1-bf33-4325-8939-4609b3c786b6\") " pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.475298 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c27c9a1-bf33-4325-8939-4609b3c786b6-client-ca\") pod \"controller-manager-5d8454f85d-4km5d\" (UID: \"5c27c9a1-bf33-4325-8939-4609b3c786b6\") " pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.476450 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c27c9a1-bf33-4325-8939-4609b3c786b6-config\") pod \"controller-manager-5d8454f85d-4km5d\" (UID: \"5c27c9a1-bf33-4325-8939-4609b3c786b6\") " pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.476719 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5c27c9a1-bf33-4325-8939-4609b3c786b6-proxy-ca-bundles\") pod \"controller-manager-5d8454f85d-4km5d\" (UID: \"5c27c9a1-bf33-4325-8939-4609b3c786b6\") " pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.478148 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c27c9a1-bf33-4325-8939-4609b3c786b6-serving-cert\") pod \"controller-manager-5d8454f85d-4km5d\" (UID: \"5c27c9a1-bf33-4325-8939-4609b3c786b6\") " pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.488306 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fzh8\" (UniqueName: \"kubernetes.io/projected/5c27c9a1-bf33-4325-8939-4609b3c786b6-kube-api-access-4fzh8\") pod \"controller-manager-5d8454f85d-4km5d\" (UID: \"5c27c9a1-bf33-4325-8939-4609b3c786b6\") " pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" Oct 12 
07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.553146 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c2f2023-e542-49b6-99f5-1051ee9b8e83" path="/var/lib/kubelet/pods/3c2f2023-e542-49b6-99f5-1051ee9b8e83/volumes" Oct 12 07:47:23 crc kubenswrapper[4599]: I1012 07:47:23.684795 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" Oct 12 07:47:24 crc kubenswrapper[4599]: I1012 07:47:24.075530 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d8454f85d-4km5d"] Oct 12 07:47:24 crc kubenswrapper[4599]: I1012 07:47:24.332264 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" event={"ID":"5c27c9a1-bf33-4325-8939-4609b3c786b6","Type":"ContainerStarted","Data":"072f06568c67fe40672219f189520a218dc3cd99420cb27ab30680c059a4c700"} Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.171222 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.194450 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft"] Oct 12 07:47:26 crc kubenswrapper[4599]: E1012 07:47:26.194761 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69d5f53c-375a-424e-a120-93d89d06ae50" containerName="route-controller-manager" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.194781 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d5f53c-375a-424e-a120-93d89d06ae50" containerName="route-controller-manager" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.194949 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="69d5f53c-375a-424e-a120-93d89d06ae50" containerName="route-controller-manager" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.195510 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.211024 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft"] Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.215874 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j2ww\" (UniqueName: \"kubernetes.io/projected/69d5f53c-375a-424e-a120-93d89d06ae50-kube-api-access-7j2ww\") pod \"69d5f53c-375a-424e-a120-93d89d06ae50\" (UID: \"69d5f53c-375a-424e-a120-93d89d06ae50\") " Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.215936 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69d5f53c-375a-424e-a120-93d89d06ae50-client-ca\") pod \"69d5f53c-375a-424e-a120-93d89d06ae50\" (UID: \"69d5f53c-375a-424e-a120-93d89d06ae50\") " Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.216025 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69d5f53c-375a-424e-a120-93d89d06ae50-config\") pod \"69d5f53c-375a-424e-a120-93d89d06ae50\" (UID: \"69d5f53c-375a-424e-a120-93d89d06ae50\") " Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.216059 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69d5f53c-375a-424e-a120-93d89d06ae50-serving-cert\") pod \"69d5f53c-375a-424e-a120-93d89d06ae50\" (UID: \"69d5f53c-375a-424e-a120-93d89d06ae50\") " Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.216246 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj25j\" (UniqueName: \"kubernetes.io/projected/8a9268e6-3cea-4ccf-8d44-33a329bd781b-kube-api-access-bj25j\") 
pod \"route-controller-manager-7ffcdff8c5-ts5ft\" (UID: \"8a9268e6-3cea-4ccf-8d44-33a329bd781b\") " pod="openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.216278 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a9268e6-3cea-4ccf-8d44-33a329bd781b-client-ca\") pod \"route-controller-manager-7ffcdff8c5-ts5ft\" (UID: \"8a9268e6-3cea-4ccf-8d44-33a329bd781b\") " pod="openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.216301 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a9268e6-3cea-4ccf-8d44-33a329bd781b-serving-cert\") pod \"route-controller-manager-7ffcdff8c5-ts5ft\" (UID: \"8a9268e6-3cea-4ccf-8d44-33a329bd781b\") " pod="openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.216435 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a9268e6-3cea-4ccf-8d44-33a329bd781b-config\") pod \"route-controller-manager-7ffcdff8c5-ts5ft\" (UID: \"8a9268e6-3cea-4ccf-8d44-33a329bd781b\") " pod="openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.217095 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69d5f53c-375a-424e-a120-93d89d06ae50-client-ca" (OuterVolumeSpecName: "client-ca") pod "69d5f53c-375a-424e-a120-93d89d06ae50" (UID: "69d5f53c-375a-424e-a120-93d89d06ae50"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.217587 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69d5f53c-375a-424e-a120-93d89d06ae50-config" (OuterVolumeSpecName: "config") pod "69d5f53c-375a-424e-a120-93d89d06ae50" (UID: "69d5f53c-375a-424e-a120-93d89d06ae50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.222633 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69d5f53c-375a-424e-a120-93d89d06ae50-kube-api-access-7j2ww" (OuterVolumeSpecName: "kube-api-access-7j2ww") pod "69d5f53c-375a-424e-a120-93d89d06ae50" (UID: "69d5f53c-375a-424e-a120-93d89d06ae50"). InnerVolumeSpecName "kube-api-access-7j2ww". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.232904 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69d5f53c-375a-424e-a120-93d89d06ae50-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "69d5f53c-375a-424e-a120-93d89d06ae50" (UID: "69d5f53c-375a-424e-a120-93d89d06ae50"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.318204 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a9268e6-3cea-4ccf-8d44-33a329bd781b-config\") pod \"route-controller-manager-7ffcdff8c5-ts5ft\" (UID: \"8a9268e6-3cea-4ccf-8d44-33a329bd781b\") " pod="openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.318273 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj25j\" (UniqueName: \"kubernetes.io/projected/8a9268e6-3cea-4ccf-8d44-33a329bd781b-kube-api-access-bj25j\") pod \"route-controller-manager-7ffcdff8c5-ts5ft\" (UID: \"8a9268e6-3cea-4ccf-8d44-33a329bd781b\") " pod="openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.318298 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a9268e6-3cea-4ccf-8d44-33a329bd781b-client-ca\") pod \"route-controller-manager-7ffcdff8c5-ts5ft\" (UID: \"8a9268e6-3cea-4ccf-8d44-33a329bd781b\") " pod="openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.318321 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a9268e6-3cea-4ccf-8d44-33a329bd781b-serving-cert\") pod \"route-controller-manager-7ffcdff8c5-ts5ft\" (UID: \"8a9268e6-3cea-4ccf-8d44-33a329bd781b\") " pod="openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.318424 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j2ww\" (UniqueName: 
\"kubernetes.io/projected/69d5f53c-375a-424e-a120-93d89d06ae50-kube-api-access-7j2ww\") on node \"crc\" DevicePath \"\"" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.318438 4599 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69d5f53c-375a-424e-a120-93d89d06ae50-client-ca\") on node \"crc\" DevicePath \"\"" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.318447 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69d5f53c-375a-424e-a120-93d89d06ae50-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.318457 4599 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69d5f53c-375a-424e-a120-93d89d06ae50-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.319252 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8a9268e6-3cea-4ccf-8d44-33a329bd781b-client-ca\") pod \"route-controller-manager-7ffcdff8c5-ts5ft\" (UID: \"8a9268e6-3cea-4ccf-8d44-33a329bd781b\") " pod="openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.319532 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a9268e6-3cea-4ccf-8d44-33a329bd781b-config\") pod \"route-controller-manager-7ffcdff8c5-ts5ft\" (UID: \"8a9268e6-3cea-4ccf-8d44-33a329bd781b\") " pod="openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.321605 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a9268e6-3cea-4ccf-8d44-33a329bd781b-serving-cert\") pod 
\"route-controller-manager-7ffcdff8c5-ts5ft\" (UID: \"8a9268e6-3cea-4ccf-8d44-33a329bd781b\") " pod="openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.331853 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj25j\" (UniqueName: \"kubernetes.io/projected/8a9268e6-3cea-4ccf-8d44-33a329bd781b-kube-api-access-bj25j\") pod \"route-controller-manager-7ffcdff8c5-ts5ft\" (UID: \"8a9268e6-3cea-4ccf-8d44-33a329bd781b\") " pod="openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.345742 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.345922 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw" event={"ID":"69d5f53c-375a-424e-a120-93d89d06ae50","Type":"ContainerDied","Data":"9d315ae2011162774cd2fcf7e3ac2a6355c14eff33398988ee2867625b67447a"} Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.346216 4599 scope.go:117] "RemoveContainer" containerID="1e2e2fbce08b456e2555df4f72f7e3bacfb9ae9768ff4a4d00995b004ec99e1f" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.346832 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" event={"ID":"5c27c9a1-bf33-4325-8939-4609b3c786b6","Type":"ContainerStarted","Data":"6ec3bb8094679e62f8c7367a23c5bf8560536488a9271d63fbe990b8b4dbf0c8"} Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.347803 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.353843 4599 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.368470 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d8454f85d-4km5d" podStartSLOduration=5.3684543 podStartE2EDuration="5.3684543s" podCreationTimestamp="2025-10-12 07:47:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:47:26.367835513 +0000 UTC m=+743.157031014" watchObservedRunningTime="2025-10-12 07:47:26.3684543 +0000 UTC m=+743.157649792" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.383506 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw"] Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.388114 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8r7fw"] Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.507511 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft" Oct 12 07:47:26 crc kubenswrapper[4599]: I1012 07:47:26.879732 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft"] Oct 12 07:47:26 crc kubenswrapper[4599]: W1012 07:47:26.883286 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a9268e6_3cea_4ccf_8d44_33a329bd781b.slice/crio-10b4a8dbff022ce4c8cb2ebfa9ea4e10939a776475754e701d811279e5bc72d5 WatchSource:0}: Error finding container 10b4a8dbff022ce4c8cb2ebfa9ea4e10939a776475754e701d811279e5bc72d5: Status 404 returned error can't find the container with id 10b4a8dbff022ce4c8cb2ebfa9ea4e10939a776475754e701d811279e5bc72d5 Oct 12 07:47:27 crc kubenswrapper[4599]: I1012 07:47:27.356594 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft" event={"ID":"8a9268e6-3cea-4ccf-8d44-33a329bd781b","Type":"ContainerStarted","Data":"9ed42cee156a1dc12deed9a772f50d462f89d1ebcca9c015e00662eee9a4b815"} Oct 12 07:47:27 crc kubenswrapper[4599]: I1012 07:47:27.356997 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft" Oct 12 07:47:27 crc kubenswrapper[4599]: I1012 07:47:27.357015 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft" event={"ID":"8a9268e6-3cea-4ccf-8d44-33a329bd781b","Type":"ContainerStarted","Data":"10b4a8dbff022ce4c8cb2ebfa9ea4e10939a776475754e701d811279e5bc72d5"} Oct 12 07:47:27 crc kubenswrapper[4599]: I1012 07:47:27.372550 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft" 
podStartSLOduration=6.372528451 podStartE2EDuration="6.372528451s" podCreationTimestamp="2025-10-12 07:47:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:47:27.369574077 +0000 UTC m=+744.158769579" watchObservedRunningTime="2025-10-12 07:47:27.372528451 +0000 UTC m=+744.161723954" Oct 12 07:47:27 crc kubenswrapper[4599]: I1012 07:47:27.555880 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69d5f53c-375a-424e-a120-93d89d06ae50" path="/var/lib/kubelet/pods/69d5f53c-375a-424e-a120-93d89d06ae50/volumes" Oct 12 07:47:27 crc kubenswrapper[4599]: I1012 07:47:27.581596 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7ffcdff8c5-ts5ft" Oct 12 07:47:28 crc kubenswrapper[4599]: I1012 07:47:28.322026 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:47:28 crc kubenswrapper[4599]: I1012 07:47:28.322111 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:47:28 crc kubenswrapper[4599]: I1012 07:47:28.322174 4599 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 07:47:28 crc kubenswrapper[4599]: I1012 07:47:28.322722 4599 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"62dd115f3eaf8ba983cf13f3b84adc51fbb09341d1c83aeb28106a411652e265"} pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 07:47:28 crc kubenswrapper[4599]: I1012 07:47:28.322798 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" containerID="cri-o://62dd115f3eaf8ba983cf13f3b84adc51fbb09341d1c83aeb28106a411652e265" gracePeriod=600 Oct 12 07:47:29 crc kubenswrapper[4599]: I1012 07:47:29.373744 4599 generic.go:334] "Generic (PLEG): container finished" podID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerID="62dd115f3eaf8ba983cf13f3b84adc51fbb09341d1c83aeb28106a411652e265" exitCode=0 Oct 12 07:47:29 crc kubenswrapper[4599]: I1012 07:47:29.373829 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerDied","Data":"62dd115f3eaf8ba983cf13f3b84adc51fbb09341d1c83aeb28106a411652e265"} Oct 12 07:47:29 crc kubenswrapper[4599]: I1012 07:47:29.374367 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerStarted","Data":"f791fc6fe233d5a2dcb3bd14d2fd8d76369bf4f0ae51317c4f0bb3b0e75a17de"} Oct 12 07:47:29 crc kubenswrapper[4599]: I1012 07:47:29.374391 4599 scope.go:117] "RemoveContainer" containerID="fa4ea3304924aa5e47754fb164316c2f3c9af596068fee357fa89cb1b44eb67a" Oct 12 07:47:30 crc kubenswrapper[4599]: I1012 07:47:30.534756 4599 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 12 07:47:34 crc 
kubenswrapper[4599]: I1012 07:47:34.912578 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-656586ff77-dprk5"] Oct 12 07:47:34 crc kubenswrapper[4599]: I1012 07:47:34.918198 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-656586ff77-dprk5" Oct 12 07:47:34 crc kubenswrapper[4599]: I1012 07:47:34.920617 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 12 07:47:34 crc kubenswrapper[4599]: I1012 07:47:34.920625 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 12 07:47:34 crc kubenswrapper[4599]: I1012 07:47:34.930053 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 12 07:47:34 crc kubenswrapper[4599]: I1012 07:47:34.930813 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-dmg6d" Oct 12 07:47:34 crc kubenswrapper[4599]: I1012 07:47:34.931750 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-656586ff77-dprk5"] Oct 12 07:47:34 crc kubenswrapper[4599]: I1012 07:47:34.949405 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9vbh\" (UniqueName: \"kubernetes.io/projected/af7a3b11-bc49-4cd5-985e-7be1c153b44c-kube-api-access-h9vbh\") pod \"dnsmasq-dns-656586ff77-dprk5\" (UID: \"af7a3b11-bc49-4cd5-985e-7be1c153b44c\") " pod="openstack/dnsmasq-dns-656586ff77-dprk5" Oct 12 07:47:34 crc kubenswrapper[4599]: I1012 07:47:34.949523 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af7a3b11-bc49-4cd5-985e-7be1c153b44c-config\") pod \"dnsmasq-dns-656586ff77-dprk5\" (UID: \"af7a3b11-bc49-4cd5-985e-7be1c153b44c\") " pod="openstack/dnsmasq-dns-656586ff77-dprk5" Oct 12 07:47:34 crc 
kubenswrapper[4599]: I1012 07:47:34.973098 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7876f7ff45-r2kr5"] Oct 12 07:47:34 crc kubenswrapper[4599]: I1012 07:47:34.982043 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7876f7ff45-r2kr5" Oct 12 07:47:34 crc kubenswrapper[4599]: I1012 07:47:34.985779 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 12 07:47:34 crc kubenswrapper[4599]: I1012 07:47:34.986049 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7876f7ff45-r2kr5"] Oct 12 07:47:35 crc kubenswrapper[4599]: I1012 07:47:35.051585 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9vbh\" (UniqueName: \"kubernetes.io/projected/af7a3b11-bc49-4cd5-985e-7be1c153b44c-kube-api-access-h9vbh\") pod \"dnsmasq-dns-656586ff77-dprk5\" (UID: \"af7a3b11-bc49-4cd5-985e-7be1c153b44c\") " pod="openstack/dnsmasq-dns-656586ff77-dprk5" Oct 12 07:47:35 crc kubenswrapper[4599]: I1012 07:47:35.051917 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c5dp\" (UniqueName: \"kubernetes.io/projected/6f00e6e8-0dd1-4756-b7fc-8d00725a9644-kube-api-access-2c5dp\") pod \"dnsmasq-dns-7876f7ff45-r2kr5\" (UID: \"6f00e6e8-0dd1-4756-b7fc-8d00725a9644\") " pod="openstack/dnsmasq-dns-7876f7ff45-r2kr5" Oct 12 07:47:35 crc kubenswrapper[4599]: I1012 07:47:35.051975 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f00e6e8-0dd1-4756-b7fc-8d00725a9644-dns-svc\") pod \"dnsmasq-dns-7876f7ff45-r2kr5\" (UID: \"6f00e6e8-0dd1-4756-b7fc-8d00725a9644\") " pod="openstack/dnsmasq-dns-7876f7ff45-r2kr5" Oct 12 07:47:35 crc kubenswrapper[4599]: I1012 07:47:35.052063 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f00e6e8-0dd1-4756-b7fc-8d00725a9644-config\") pod \"dnsmasq-dns-7876f7ff45-r2kr5\" (UID: \"6f00e6e8-0dd1-4756-b7fc-8d00725a9644\") " pod="openstack/dnsmasq-dns-7876f7ff45-r2kr5" Oct 12 07:47:35 crc kubenswrapper[4599]: I1012 07:47:35.052098 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af7a3b11-bc49-4cd5-985e-7be1c153b44c-config\") pod \"dnsmasq-dns-656586ff77-dprk5\" (UID: \"af7a3b11-bc49-4cd5-985e-7be1c153b44c\") " pod="openstack/dnsmasq-dns-656586ff77-dprk5" Oct 12 07:47:35 crc kubenswrapper[4599]: I1012 07:47:35.052889 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af7a3b11-bc49-4cd5-985e-7be1c153b44c-config\") pod \"dnsmasq-dns-656586ff77-dprk5\" (UID: \"af7a3b11-bc49-4cd5-985e-7be1c153b44c\") " pod="openstack/dnsmasq-dns-656586ff77-dprk5" Oct 12 07:47:35 crc kubenswrapper[4599]: I1012 07:47:35.071182 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9vbh\" (UniqueName: \"kubernetes.io/projected/af7a3b11-bc49-4cd5-985e-7be1c153b44c-kube-api-access-h9vbh\") pod \"dnsmasq-dns-656586ff77-dprk5\" (UID: \"af7a3b11-bc49-4cd5-985e-7be1c153b44c\") " pod="openstack/dnsmasq-dns-656586ff77-dprk5" Oct 12 07:47:35 crc kubenswrapper[4599]: I1012 07:47:35.153862 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f00e6e8-0dd1-4756-b7fc-8d00725a9644-config\") pod \"dnsmasq-dns-7876f7ff45-r2kr5\" (UID: \"6f00e6e8-0dd1-4756-b7fc-8d00725a9644\") " pod="openstack/dnsmasq-dns-7876f7ff45-r2kr5" Oct 12 07:47:35 crc kubenswrapper[4599]: I1012 07:47:35.154076 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c5dp\" (UniqueName: 
\"kubernetes.io/projected/6f00e6e8-0dd1-4756-b7fc-8d00725a9644-kube-api-access-2c5dp\") pod \"dnsmasq-dns-7876f7ff45-r2kr5\" (UID: \"6f00e6e8-0dd1-4756-b7fc-8d00725a9644\") " pod="openstack/dnsmasq-dns-7876f7ff45-r2kr5" Oct 12 07:47:35 crc kubenswrapper[4599]: I1012 07:47:35.154117 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f00e6e8-0dd1-4756-b7fc-8d00725a9644-dns-svc\") pod \"dnsmasq-dns-7876f7ff45-r2kr5\" (UID: \"6f00e6e8-0dd1-4756-b7fc-8d00725a9644\") " pod="openstack/dnsmasq-dns-7876f7ff45-r2kr5" Oct 12 07:47:35 crc kubenswrapper[4599]: I1012 07:47:35.154790 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f00e6e8-0dd1-4756-b7fc-8d00725a9644-config\") pod \"dnsmasq-dns-7876f7ff45-r2kr5\" (UID: \"6f00e6e8-0dd1-4756-b7fc-8d00725a9644\") " pod="openstack/dnsmasq-dns-7876f7ff45-r2kr5" Oct 12 07:47:35 crc kubenswrapper[4599]: I1012 07:47:35.154958 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f00e6e8-0dd1-4756-b7fc-8d00725a9644-dns-svc\") pod \"dnsmasq-dns-7876f7ff45-r2kr5\" (UID: \"6f00e6e8-0dd1-4756-b7fc-8d00725a9644\") " pod="openstack/dnsmasq-dns-7876f7ff45-r2kr5" Oct 12 07:47:35 crc kubenswrapper[4599]: I1012 07:47:35.168651 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c5dp\" (UniqueName: \"kubernetes.io/projected/6f00e6e8-0dd1-4756-b7fc-8d00725a9644-kube-api-access-2c5dp\") pod \"dnsmasq-dns-7876f7ff45-r2kr5\" (UID: \"6f00e6e8-0dd1-4756-b7fc-8d00725a9644\") " pod="openstack/dnsmasq-dns-7876f7ff45-r2kr5" Oct 12 07:47:35 crc kubenswrapper[4599]: I1012 07:47:35.251469 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-656586ff77-dprk5" Oct 12 07:47:35 crc kubenswrapper[4599]: I1012 07:47:35.302038 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7876f7ff45-r2kr5" Oct 12 07:47:35 crc kubenswrapper[4599]: I1012 07:47:35.631518 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-656586ff77-dprk5"] Oct 12 07:47:35 crc kubenswrapper[4599]: I1012 07:47:35.726276 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7876f7ff45-r2kr5"] Oct 12 07:47:35 crc kubenswrapper[4599]: W1012 07:47:35.729356 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f00e6e8_0dd1_4756_b7fc_8d00725a9644.slice/crio-990d59d2a7c170930842d1dd60c8d0384d2c6419f00b58d6b8b81aae7bc5425e WatchSource:0}: Error finding container 990d59d2a7c170930842d1dd60c8d0384d2c6419f00b58d6b8b81aae7bc5425e: Status 404 returned error can't find the container with id 990d59d2a7c170930842d1dd60c8d0384d2c6419f00b58d6b8b81aae7bc5425e Oct 12 07:47:36 crc kubenswrapper[4599]: I1012 07:47:36.433790 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7876f7ff45-r2kr5" event={"ID":"6f00e6e8-0dd1-4756-b7fc-8d00725a9644","Type":"ContainerStarted","Data":"990d59d2a7c170930842d1dd60c8d0384d2c6419f00b58d6b8b81aae7bc5425e"} Oct 12 07:47:36 crc kubenswrapper[4599]: I1012 07:47:36.435407 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-656586ff77-dprk5" event={"ID":"af7a3b11-bc49-4cd5-985e-7be1c153b44c","Type":"ContainerStarted","Data":"d1709d37eac125f2c7b458783c944e0a8d5d4bc8a33f0a973193012e706ed1d3"} Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.097523 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-656586ff77-dprk5"] Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.120862 4599 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7596fbdcc-5l45x"] Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.122076 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.132654 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7596fbdcc-5l45x"] Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.204780 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7ae0131-ce00-4e3f-a679-a9745562291d-config\") pod \"dnsmasq-dns-7596fbdcc-5l45x\" (UID: \"f7ae0131-ce00-4e3f-a679-a9745562291d\") " pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.205132 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd596\" (UniqueName: \"kubernetes.io/projected/f7ae0131-ce00-4e3f-a679-a9745562291d-kube-api-access-cd596\") pod \"dnsmasq-dns-7596fbdcc-5l45x\" (UID: \"f7ae0131-ce00-4e3f-a679-a9745562291d\") " pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.205152 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7ae0131-ce00-4e3f-a679-a9745562291d-dns-svc\") pod \"dnsmasq-dns-7596fbdcc-5l45x\" (UID: \"f7ae0131-ce00-4e3f-a679-a9745562291d\") " pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.306806 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7ae0131-ce00-4e3f-a679-a9745562291d-config\") pod \"dnsmasq-dns-7596fbdcc-5l45x\" (UID: \"f7ae0131-ce00-4e3f-a679-a9745562291d\") " 
pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.306868 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd596\" (UniqueName: \"kubernetes.io/projected/f7ae0131-ce00-4e3f-a679-a9745562291d-kube-api-access-cd596\") pod \"dnsmasq-dns-7596fbdcc-5l45x\" (UID: \"f7ae0131-ce00-4e3f-a679-a9745562291d\") " pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.306895 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7ae0131-ce00-4e3f-a679-a9745562291d-dns-svc\") pod \"dnsmasq-dns-7596fbdcc-5l45x\" (UID: \"f7ae0131-ce00-4e3f-a679-a9745562291d\") " pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.307754 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7ae0131-ce00-4e3f-a679-a9745562291d-config\") pod \"dnsmasq-dns-7596fbdcc-5l45x\" (UID: \"f7ae0131-ce00-4e3f-a679-a9745562291d\") " pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.307754 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7ae0131-ce00-4e3f-a679-a9745562291d-dns-svc\") pod \"dnsmasq-dns-7596fbdcc-5l45x\" (UID: \"f7ae0131-ce00-4e3f-a679-a9745562291d\") " pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.344321 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd596\" (UniqueName: \"kubernetes.io/projected/f7ae0131-ce00-4e3f-a679-a9745562291d-kube-api-access-cd596\") pod \"dnsmasq-dns-7596fbdcc-5l45x\" (UID: \"f7ae0131-ce00-4e3f-a679-a9745562291d\") " pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 
07:47:38.385421 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7876f7ff45-r2kr5"] Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.394070 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69bb789bb9-4nnl8"] Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.402310 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.407844 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69bb789bb9-4nnl8"] Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.438790 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.516216 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91fafba3-ee3c-4380-be34-b897a835c882-config\") pod \"dnsmasq-dns-69bb789bb9-4nnl8\" (UID: \"91fafba3-ee3c-4380-be34-b897a835c882\") " pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.516328 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lhqd\" (UniqueName: \"kubernetes.io/projected/91fafba3-ee3c-4380-be34-b897a835c882-kube-api-access-9lhqd\") pod \"dnsmasq-dns-69bb789bb9-4nnl8\" (UID: \"91fafba3-ee3c-4380-be34-b897a835c882\") " pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.516450 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91fafba3-ee3c-4380-be34-b897a835c882-dns-svc\") pod \"dnsmasq-dns-69bb789bb9-4nnl8\" (UID: \"91fafba3-ee3c-4380-be34-b897a835c882\") " 
pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.618576 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91fafba3-ee3c-4380-be34-b897a835c882-dns-svc\") pod \"dnsmasq-dns-69bb789bb9-4nnl8\" (UID: \"91fafba3-ee3c-4380-be34-b897a835c882\") " pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.619403 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91fafba3-ee3c-4380-be34-b897a835c882-config\") pod \"dnsmasq-dns-69bb789bb9-4nnl8\" (UID: \"91fafba3-ee3c-4380-be34-b897a835c882\") " pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.619496 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lhqd\" (UniqueName: \"kubernetes.io/projected/91fafba3-ee3c-4380-be34-b897a835c882-kube-api-access-9lhqd\") pod \"dnsmasq-dns-69bb789bb9-4nnl8\" (UID: \"91fafba3-ee3c-4380-be34-b897a835c882\") " pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.619566 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91fafba3-ee3c-4380-be34-b897a835c882-dns-svc\") pod \"dnsmasq-dns-69bb789bb9-4nnl8\" (UID: \"91fafba3-ee3c-4380-be34-b897a835c882\") " pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.621047 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91fafba3-ee3c-4380-be34-b897a835c882-config\") pod \"dnsmasq-dns-69bb789bb9-4nnl8\" (UID: \"91fafba3-ee3c-4380-be34-b897a835c882\") " pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.639617 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lhqd\" (UniqueName: \"kubernetes.io/projected/91fafba3-ee3c-4380-be34-b897a835c882-kube-api-access-9lhqd\") pod \"dnsmasq-dns-69bb789bb9-4nnl8\" (UID: \"91fafba3-ee3c-4380-be34-b897a835c882\") " pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.727427 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" Oct 12 07:47:38 crc kubenswrapper[4599]: I1012 07:47:38.905325 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7596fbdcc-5l45x"] Oct 12 07:47:38 crc kubenswrapper[4599]: W1012 07:47:38.912785 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7ae0131_ce00_4e3f_a679_a9745562291d.slice/crio-993abbbe1b8a3cb890c897db4cac598f11c8a19d84ab00b34ff453feeed18552 WatchSource:0}: Error finding container 993abbbe1b8a3cb890c897db4cac598f11c8a19d84ab00b34ff453feeed18552: Status 404 returned error can't find the container with id 993abbbe1b8a3cb890c897db4cac598f11c8a19d84ab00b34ff453feeed18552 Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.137938 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69bb789bb9-4nnl8"] Oct 12 07:47:39 crc kubenswrapper[4599]: W1012 07:47:39.146306 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91fafba3_ee3c_4380_be34_b897a835c882.slice/crio-a85f7795ec0add658f7480a09e462840b1f1615108f33255a271de37b26d9300 WatchSource:0}: Error finding container a85f7795ec0add658f7480a09e462840b1f1615108f33255a271de37b26d9300: Status 404 returned error can't find the container with id a85f7795ec0add658f7480a09e462840b1f1615108f33255a271de37b26d9300 Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.274283 4599 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.278214 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.282815 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.282841 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.283189 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.283273 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.283361 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-gj9zn" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.283553 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.283605 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.286401 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.431553 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4612b7e8-a507-4c57-989d-3411e4e302dd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc 
kubenswrapper[4599]: I1012 07:47:39.431951 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4612b7e8-a507-4c57-989d-3411e4e302dd-config-data\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.432065 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.432090 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.432147 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.432177 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x98v2\" (UniqueName: \"kubernetes.io/projected/4612b7e8-a507-4c57-989d-3411e4e302dd-kube-api-access-x98v2\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 
07:47:39.432248 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4612b7e8-a507-4c57-989d-3411e4e302dd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.432308 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4612b7e8-a507-4c57-989d-3411e4e302dd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.432373 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4612b7e8-a507-4c57-989d-3411e4e302dd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.432400 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.432653 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.485456 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" 
event={"ID":"f7ae0131-ce00-4e3f-a679-a9745562291d","Type":"ContainerStarted","Data":"993abbbe1b8a3cb890c897db4cac598f11c8a19d84ab00b34ff453feeed18552"} Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.487407 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" event={"ID":"91fafba3-ee3c-4380-be34-b897a835c882","Type":"ContainerStarted","Data":"a85f7795ec0add658f7480a09e462840b1f1615108f33255a271de37b26d9300"} Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.534637 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4612b7e8-a507-4c57-989d-3411e4e302dd-config-data\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.534770 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.534800 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.534821 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 
07:47:39.534854 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x98v2\" (UniqueName: \"kubernetes.io/projected/4612b7e8-a507-4c57-989d-3411e4e302dd-kube-api-access-x98v2\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.534896 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4612b7e8-a507-4c57-989d-3411e4e302dd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.534916 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4612b7e8-a507-4c57-989d-3411e4e302dd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.534940 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4612b7e8-a507-4c57-989d-3411e4e302dd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.534963 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.535029 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.535059 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4612b7e8-a507-4c57-989d-3411e4e302dd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.535366 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.535550 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4612b7e8-a507-4c57-989d-3411e4e302dd-config-data\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.535740 4599 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.536919 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4612b7e8-a507-4c57-989d-3411e4e302dd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " 
pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.536493 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.539649 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4612b7e8-a507-4c57-989d-3411e4e302dd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.542225 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4612b7e8-a507-4c57-989d-3411e4e302dd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.542855 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.543266 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.551946 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/4612b7e8-a507-4c57-989d-3411e4e302dd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.559048 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x98v2\" (UniqueName: \"kubernetes.io/projected/4612b7e8-a507-4c57-989d-3411e4e302dd-kube-api-access-x98v2\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.561661 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.564002 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.565314 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.565421 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.566874 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rgq2c" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.570743 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.570769 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.570821 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.570954 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.571200 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.571204 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.602142 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.738434 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g59r\" (UniqueName: \"kubernetes.io/projected/2e036a1a-bc46-419f-88e4-312037490ec1-kube-api-access-6g59r\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.738668 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.738693 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2e036a1a-bc46-419f-88e4-312037490ec1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.738723 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.738771 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2e036a1a-bc46-419f-88e4-312037490ec1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.738812 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e036a1a-bc46-419f-88e4-312037490ec1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.738840 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.738859 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2e036a1a-bc46-419f-88e4-312037490ec1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.738945 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.738997 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.739036 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2e036a1a-bc46-419f-88e4-312037490ec1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.840031 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e036a1a-bc46-419f-88e4-312037490ec1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.840371 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.840399 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2e036a1a-bc46-419f-88e4-312037490ec1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.840449 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc 
kubenswrapper[4599]: I1012 07:47:39.840472 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.840500 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2e036a1a-bc46-419f-88e4-312037490ec1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.840527 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g59r\" (UniqueName: \"kubernetes.io/projected/2e036a1a-bc46-419f-88e4-312037490ec1-kube-api-access-6g59r\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.840549 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.840566 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2e036a1a-bc46-419f-88e4-312037490ec1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.840584 4599 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.840610 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2e036a1a-bc46-419f-88e4-312037490ec1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.840909 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.841154 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e036a1a-bc46-419f-88e4-312037490ec1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.841523 4599 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.841877 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.842079 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2e036a1a-bc46-419f-88e4-312037490ec1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.842191 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2e036a1a-bc46-419f-88e4-312037490ec1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.849594 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2e036a1a-bc46-419f-88e4-312037490ec1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.850741 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.851251 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.855283 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2e036a1a-bc46-419f-88e4-312037490ec1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.855611 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g59r\" (UniqueName: \"kubernetes.io/projected/2e036a1a-bc46-419f-88e4-312037490ec1-kube-api-access-6g59r\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.862920 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:39 crc kubenswrapper[4599]: I1012 07:47:39.902308 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:47:40 crc kubenswrapper[4599]: I1012 07:47:40.027199 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 07:47:40 crc kubenswrapper[4599]: W1012 07:47:40.055697 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4612b7e8_a507_4c57_989d_3411e4e302dd.slice/crio-799743a26f0045717426a4c23e0ed1b66e3dd4db0002797f963c997b1a92d56d WatchSource:0}: Error finding container 799743a26f0045717426a4c23e0ed1b66e3dd4db0002797f963c997b1a92d56d: Status 404 returned error can't find the container with id 799743a26f0045717426a4c23e0ed1b66e3dd4db0002797f963c997b1a92d56d Oct 12 07:47:40 crc kubenswrapper[4599]: I1012 07:47:40.362835 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 07:47:40 crc kubenswrapper[4599]: W1012 07:47:40.372023 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e036a1a_bc46_419f_88e4_312037490ec1.slice/crio-8fe3e42b1dec5d7152fb84a3223b1d88da96ba8ea7f99ef467f1bf5d1e0cd136 WatchSource:0}: Error finding container 8fe3e42b1dec5d7152fb84a3223b1d88da96ba8ea7f99ef467f1bf5d1e0cd136: Status 404 returned error can't find the container with id 8fe3e42b1dec5d7152fb84a3223b1d88da96ba8ea7f99ef467f1bf5d1e0cd136 Oct 12 07:47:40 crc kubenswrapper[4599]: I1012 07:47:40.496355 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4612b7e8-a507-4c57-989d-3411e4e302dd","Type":"ContainerStarted","Data":"799743a26f0045717426a4c23e0ed1b66e3dd4db0002797f963c997b1a92d56d"} Oct 12 07:47:40 crc kubenswrapper[4599]: I1012 07:47:40.498143 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"2e036a1a-bc46-419f-88e4-312037490ec1","Type":"ContainerStarted","Data":"8fe3e42b1dec5d7152fb84a3223b1d88da96ba8ea7f99ef467f1bf5d1e0cd136"} Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.033049 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.044537 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.049906 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.050436 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.050440 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bscvd" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.051254 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.057416 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.057470 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.076164 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.169703 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " 
pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.169924 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.171078 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k696h\" (UniqueName: \"kubernetes.io/projected/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-kube-api-access-k696h\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.171225 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.171286 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-config-data-default\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.171400 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 
07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.171439 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-kolla-config\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.171492 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.171532 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-secrets\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.273117 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.273402 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k696h\" (UniqueName: \"kubernetes.io/projected/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-kube-api-access-k696h\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.273456 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.273507 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-config-data-default\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.273547 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.273597 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-kolla-config\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.273692 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.273738 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-secrets\") pod 
\"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.273793 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.273801 4599 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.274954 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-config-data-default\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.275135 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-kolla-config\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.275176 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc 
kubenswrapper[4599]: I1012 07:47:41.276260 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.281183 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.283537 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.286356 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-secrets\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.302089 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k696h\" (UniqueName: \"kubernetes.io/projected/4a035bca-ccfe-4dc6-949a-44d2ddf0fa26-kube-api-access-k696h\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.302420 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26\") " pod="openstack/openstack-galera-0" Oct 12 07:47:41 crc kubenswrapper[4599]: I1012 07:47:41.371945 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.081707 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.394258 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.395795 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.397722 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.398266 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-s5gst" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.398443 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.398591 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.403388 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.497083 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e76c3f9c-bea3-4b35-852c-65d48f177d8a-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.497176 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.497256 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fl7v\" (UniqueName: \"kubernetes.io/projected/e76c3f9c-bea3-4b35-852c-65d48f177d8a-kube-api-access-4fl7v\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.497299 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e76c3f9c-bea3-4b35-852c-65d48f177d8a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.497437 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e76c3f9c-bea3-4b35-852c-65d48f177d8a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.497495 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/e76c3f9c-bea3-4b35-852c-65d48f177d8a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.497509 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e76c3f9c-bea3-4b35-852c-65d48f177d8a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.497538 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e76c3f9c-bea3-4b35-852c-65d48f177d8a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.497601 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e76c3f9c-bea3-4b35-852c-65d48f177d8a-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.527308 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26","Type":"ContainerStarted","Data":"27d62348ebb9b16cbdd76e3ff76ac7ac0cef3380c28e966cd595f23c84845af4"} Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.599318 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e76c3f9c-bea3-4b35-852c-65d48f177d8a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: 
\"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.599431 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e76c3f9c-bea3-4b35-852c-65d48f177d8a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.599457 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e76c3f9c-bea3-4b35-852c-65d48f177d8a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.599494 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e76c3f9c-bea3-4b35-852c-65d48f177d8a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.599553 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e76c3f9c-bea3-4b35-852c-65d48f177d8a-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.599626 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e76c3f9c-bea3-4b35-852c-65d48f177d8a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 
07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.599667 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.599690 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fl7v\" (UniqueName: \"kubernetes.io/projected/e76c3f9c-bea3-4b35-852c-65d48f177d8a-kube-api-access-4fl7v\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.599728 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e76c3f9c-bea3-4b35-852c-65d48f177d8a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.601796 4599 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.602911 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e76c3f9c-bea3-4b35-852c-65d48f177d8a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.603304 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e76c3f9c-bea3-4b35-852c-65d48f177d8a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.603544 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e76c3f9c-bea3-4b35-852c-65d48f177d8a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.604618 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e76c3f9c-bea3-4b35-852c-65d48f177d8a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.608300 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e76c3f9c-bea3-4b35-852c-65d48f177d8a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.608702 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e76c3f9c-bea3-4b35-852c-65d48f177d8a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.619281 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: 
\"kubernetes.io/secret/e76c3f9c-bea3-4b35-852c-65d48f177d8a-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.624377 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fl7v\" (UniqueName: \"kubernetes.io/projected/e76c3f9c-bea3-4b35-852c-65d48f177d8a-kube-api-access-4fl7v\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.637117 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e76c3f9c-bea3-4b35-852c-65d48f177d8a\") " pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.708852 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.709823 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.716450 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-rgn6k" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.716863 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.717612 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.721055 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.733366 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.806424 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fe3787ed-cb03-457b-ad65-b33044cccffd-kolla-config\") pod \"memcached-0\" (UID: \"fe3787ed-cb03-457b-ad65-b33044cccffd\") " pod="openstack/memcached-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.806806 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe3787ed-cb03-457b-ad65-b33044cccffd-config-data\") pod \"memcached-0\" (UID: \"fe3787ed-cb03-457b-ad65-b33044cccffd\") " pod="openstack/memcached-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.806861 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe3787ed-cb03-457b-ad65-b33044cccffd-combined-ca-bundle\") pod \"memcached-0\" (UID: \"fe3787ed-cb03-457b-ad65-b33044cccffd\") " pod="openstack/memcached-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.806897 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe3787ed-cb03-457b-ad65-b33044cccffd-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fe3787ed-cb03-457b-ad65-b33044cccffd\") " pod="openstack/memcached-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.807047 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6xzg\" (UniqueName: 
\"kubernetes.io/projected/fe3787ed-cb03-457b-ad65-b33044cccffd-kube-api-access-d6xzg\") pod \"memcached-0\" (UID: \"fe3787ed-cb03-457b-ad65-b33044cccffd\") " pod="openstack/memcached-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.908560 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe3787ed-cb03-457b-ad65-b33044cccffd-config-data\") pod \"memcached-0\" (UID: \"fe3787ed-cb03-457b-ad65-b33044cccffd\") " pod="openstack/memcached-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.908650 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe3787ed-cb03-457b-ad65-b33044cccffd-combined-ca-bundle\") pod \"memcached-0\" (UID: \"fe3787ed-cb03-457b-ad65-b33044cccffd\") " pod="openstack/memcached-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.908684 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe3787ed-cb03-457b-ad65-b33044cccffd-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fe3787ed-cb03-457b-ad65-b33044cccffd\") " pod="openstack/memcached-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.908805 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6xzg\" (UniqueName: \"kubernetes.io/projected/fe3787ed-cb03-457b-ad65-b33044cccffd-kube-api-access-d6xzg\") pod \"memcached-0\" (UID: \"fe3787ed-cb03-457b-ad65-b33044cccffd\") " pod="openstack/memcached-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.908869 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fe3787ed-cb03-457b-ad65-b33044cccffd-kolla-config\") pod \"memcached-0\" (UID: \"fe3787ed-cb03-457b-ad65-b33044cccffd\") " pod="openstack/memcached-0" Oct 12 07:47:42 crc 
kubenswrapper[4599]: I1012 07:47:42.909721 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fe3787ed-cb03-457b-ad65-b33044cccffd-kolla-config\") pod \"memcached-0\" (UID: \"fe3787ed-cb03-457b-ad65-b33044cccffd\") " pod="openstack/memcached-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.912360 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe3787ed-cb03-457b-ad65-b33044cccffd-config-data\") pod \"memcached-0\" (UID: \"fe3787ed-cb03-457b-ad65-b33044cccffd\") " pod="openstack/memcached-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.918145 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe3787ed-cb03-457b-ad65-b33044cccffd-combined-ca-bundle\") pod \"memcached-0\" (UID: \"fe3787ed-cb03-457b-ad65-b33044cccffd\") " pod="openstack/memcached-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.919028 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe3787ed-cb03-457b-ad65-b33044cccffd-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fe3787ed-cb03-457b-ad65-b33044cccffd\") " pod="openstack/memcached-0" Oct 12 07:47:42 crc kubenswrapper[4599]: I1012 07:47:42.931715 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6xzg\" (UniqueName: \"kubernetes.io/projected/fe3787ed-cb03-457b-ad65-b33044cccffd-kube-api-access-d6xzg\") pod \"memcached-0\" (UID: \"fe3787ed-cb03-457b-ad65-b33044cccffd\") " pod="openstack/memcached-0" Oct 12 07:47:43 crc kubenswrapper[4599]: I1012 07:47:43.037764 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 12 07:47:43 crc kubenswrapper[4599]: I1012 07:47:43.318614 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 12 07:47:43 crc kubenswrapper[4599]: I1012 07:47:43.540091 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e76c3f9c-bea3-4b35-852c-65d48f177d8a","Type":"ContainerStarted","Data":"7629064a55901d602bb24d646ce8c6e590d6bf7b9f22c73eac7798d260587aac"} Oct 12 07:47:43 crc kubenswrapper[4599]: I1012 07:47:43.542867 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 12 07:47:43 crc kubenswrapper[4599]: W1012 07:47:43.552405 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe3787ed_cb03_457b_ad65_b33044cccffd.slice/crio-5436f42a3e951f1f479f9f9bfb03376960c55500cfdc08f2f862203215080ad4 WatchSource:0}: Error finding container 5436f42a3e951f1f479f9f9bfb03376960c55500cfdc08f2f862203215080ad4: Status 404 returned error can't find the container with id 5436f42a3e951f1f479f9f9bfb03376960c55500cfdc08f2f862203215080ad4 Oct 12 07:47:44 crc kubenswrapper[4599]: I1012 07:47:44.425370 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 07:47:44 crc kubenswrapper[4599]: I1012 07:47:44.427003 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 12 07:47:44 crc kubenswrapper[4599]: I1012 07:47:44.431607 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-cpl5m" Oct 12 07:47:44 crc kubenswrapper[4599]: I1012 07:47:44.440095 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 07:47:44 crc kubenswrapper[4599]: I1012 07:47:44.544189 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xvtt\" (UniqueName: \"kubernetes.io/projected/0ca657cf-7ac2-4cb5-894b-c3a149ee4101-kube-api-access-5xvtt\") pod \"kube-state-metrics-0\" (UID: \"0ca657cf-7ac2-4cb5-894b-c3a149ee4101\") " pod="openstack/kube-state-metrics-0" Oct 12 07:47:44 crc kubenswrapper[4599]: I1012 07:47:44.553134 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"fe3787ed-cb03-457b-ad65-b33044cccffd","Type":"ContainerStarted","Data":"5436f42a3e951f1f479f9f9bfb03376960c55500cfdc08f2f862203215080ad4"} Oct 12 07:47:44 crc kubenswrapper[4599]: I1012 07:47:44.646585 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xvtt\" (UniqueName: \"kubernetes.io/projected/0ca657cf-7ac2-4cb5-894b-c3a149ee4101-kube-api-access-5xvtt\") pod \"kube-state-metrics-0\" (UID: \"0ca657cf-7ac2-4cb5-894b-c3a149ee4101\") " pod="openstack/kube-state-metrics-0" Oct 12 07:47:44 crc kubenswrapper[4599]: I1012 07:47:44.665088 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xvtt\" (UniqueName: \"kubernetes.io/projected/0ca657cf-7ac2-4cb5-894b-c3a149ee4101-kube-api-access-5xvtt\") pod \"kube-state-metrics-0\" (UID: \"0ca657cf-7ac2-4cb5-894b-c3a149ee4101\") " pod="openstack/kube-state-metrics-0" Oct 12 07:47:44 crc kubenswrapper[4599]: I1012 07:47:44.758988 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.084903 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9rbk6"] Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.106003 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9rbk6"] Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.106113 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.109401 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.109469 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-8s62q"] Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.109643 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-jmfbc" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.110656 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.113090 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.113370 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8s62q"] Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.220467 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/78cb767a-31ee-4e29-b075-e773a43272c2-var-run\") pod \"ovn-controller-ovs-8s62q\" (UID: \"78cb767a-31ee-4e29-b075-e773a43272c2\") " pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.220603 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-var-run-ovn\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.220656 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-scripts\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.220699 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-var-log-ovn\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.220720 4599 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-ovn-controller-tls-certs\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.220827 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/78cb767a-31ee-4e29-b075-e773a43272c2-var-log\") pod \"ovn-controller-ovs-8s62q\" (UID: \"78cb767a-31ee-4e29-b075-e773a43272c2\") " pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.220885 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-combined-ca-bundle\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.220968 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7znf8\" (UniqueName: \"kubernetes.io/projected/78cb767a-31ee-4e29-b075-e773a43272c2-kube-api-access-7znf8\") pod \"ovn-controller-ovs-8s62q\" (UID: \"78cb767a-31ee-4e29-b075-e773a43272c2\") " pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.221005 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78cb767a-31ee-4e29-b075-e773a43272c2-scripts\") pod \"ovn-controller-ovs-8s62q\" (UID: \"78cb767a-31ee-4e29-b075-e773a43272c2\") " pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.221028 4599 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/78cb767a-31ee-4e29-b075-e773a43272c2-var-lib\") pod \"ovn-controller-ovs-8s62q\" (UID: \"78cb767a-31ee-4e29-b075-e773a43272c2\") " pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.221138 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p2b8\" (UniqueName: \"kubernetes.io/projected/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-kube-api-access-9p2b8\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.221239 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/78cb767a-31ee-4e29-b075-e773a43272c2-etc-ovs\") pod \"ovn-controller-ovs-8s62q\" (UID: \"78cb767a-31ee-4e29-b075-e773a43272c2\") " pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.221297 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-var-run\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.325771 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7znf8\" (UniqueName: \"kubernetes.io/projected/78cb767a-31ee-4e29-b075-e773a43272c2-kube-api-access-7znf8\") pod \"ovn-controller-ovs-8s62q\" (UID: \"78cb767a-31ee-4e29-b075-e773a43272c2\") " pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.325830 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78cb767a-31ee-4e29-b075-e773a43272c2-scripts\") pod \"ovn-controller-ovs-8s62q\" (UID: \"78cb767a-31ee-4e29-b075-e773a43272c2\") " pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.325879 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/78cb767a-31ee-4e29-b075-e773a43272c2-var-lib\") pod \"ovn-controller-ovs-8s62q\" (UID: \"78cb767a-31ee-4e29-b075-e773a43272c2\") " pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.325902 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p2b8\" (UniqueName: \"kubernetes.io/projected/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-kube-api-access-9p2b8\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.326556 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/78cb767a-31ee-4e29-b075-e773a43272c2-var-lib\") pod \"ovn-controller-ovs-8s62q\" (UID: \"78cb767a-31ee-4e29-b075-e773a43272c2\") " pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.328461 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78cb767a-31ee-4e29-b075-e773a43272c2-scripts\") pod \"ovn-controller-ovs-8s62q\" (UID: \"78cb767a-31ee-4e29-b075-e773a43272c2\") " pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.328609 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/78cb767a-31ee-4e29-b075-e773a43272c2-etc-ovs\") pod 
\"ovn-controller-ovs-8s62q\" (UID: \"78cb767a-31ee-4e29-b075-e773a43272c2\") " pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.337595 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-var-run\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.338008 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-var-run\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.338000 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/78cb767a-31ee-4e29-b075-e773a43272c2-etc-ovs\") pod \"ovn-controller-ovs-8s62q\" (UID: \"78cb767a-31ee-4e29-b075-e773a43272c2\") " pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.338206 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/78cb767a-31ee-4e29-b075-e773a43272c2-var-run\") pod \"ovn-controller-ovs-8s62q\" (UID: \"78cb767a-31ee-4e29-b075-e773a43272c2\") " pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.338270 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-var-run-ovn\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.338436 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-var-run-ovn\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.338465 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/78cb767a-31ee-4e29-b075-e773a43272c2-var-run\") pod \"ovn-controller-ovs-8s62q\" (UID: \"78cb767a-31ee-4e29-b075-e773a43272c2\") " pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.338650 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-scripts\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.338690 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-var-log-ovn\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.338906 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-var-log-ovn\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.338721 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-ovn-controller-tls-certs\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.339044 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/78cb767a-31ee-4e29-b075-e773a43272c2-var-log\") pod \"ovn-controller-ovs-8s62q\" (UID: \"78cb767a-31ee-4e29-b075-e773a43272c2\") " pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.339186 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/78cb767a-31ee-4e29-b075-e773a43272c2-var-log\") pod \"ovn-controller-ovs-8s62q\" (UID: \"78cb767a-31ee-4e29-b075-e773a43272c2\") " pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.339259 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-combined-ca-bundle\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.344225 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-combined-ca-bundle\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.344259 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7znf8\" (UniqueName: \"kubernetes.io/projected/78cb767a-31ee-4e29-b075-e773a43272c2-kube-api-access-7znf8\") pod \"ovn-controller-ovs-8s62q\" (UID: 
\"78cb767a-31ee-4e29-b075-e773a43272c2\") " pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.344270 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-scripts\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.353778 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-ovn-controller-tls-certs\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.354206 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p2b8\" (UniqueName: \"kubernetes.io/projected/ff6de4a7-bd76-46bc-a376-b1ec8c5ab712-kube-api-access-9p2b8\") pod \"ovn-controller-9rbk6\" (UID: \"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712\") " pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.427777 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9rbk6" Oct 12 07:47:48 crc kubenswrapper[4599]: I1012 07:47:48.443923 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.439328 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.441078 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.442971 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-mf6xx" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.443622 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.443628 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.444368 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.444562 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.450703 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.566129 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-config\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.566216 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.566313 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.566390 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.566599 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9dk9\" (UniqueName: \"kubernetes.io/projected/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-kube-api-access-v9dk9\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.566726 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.566811 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.566962 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.668604 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.668662 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-config\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.668700 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.668725 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.668760 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" 
Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.668799 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9dk9\" (UniqueName: \"kubernetes.io/projected/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-kube-api-access-v9dk9\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.668821 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.668846 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.669231 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.671702 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-config\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.671850 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.672308 4599 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.675239 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.680271 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.686638 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9dk9\" (UniqueName: \"kubernetes.io/projected/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-kube-api-access-v9dk9\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.697099 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d3345d9-bc30-42dc-98e0-bfd24fee35ab-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: 
\"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.699328 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9d3345d9-bc30-42dc-98e0-bfd24fee35ab\") " pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:49 crc kubenswrapper[4599]: I1012 07:47:49.759932 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.267230 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.268619 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.272165 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.273614 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-7x7pf" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.274310 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.276154 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.279602 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.410733 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b7b21b67-7112-4507-a5d2-9036f09a3cdf-config\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.410902 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b7b21b67-7112-4507-a5d2-9036f09a3cdf-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.410945 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7b21b67-7112-4507-a5d2-9036f09a3cdf-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.410975 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.411015 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b21b67-7112-4507-a5d2-9036f09a3cdf-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.411033 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b21b67-7112-4507-a5d2-9036f09a3cdf-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" 
(UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.411069 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2nt6\" (UniqueName: \"kubernetes.io/projected/b7b21b67-7112-4507-a5d2-9036f09a3cdf-kube-api-access-n2nt6\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.411097 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b21b67-7112-4507-a5d2-9036f09a3cdf-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.512033 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2nt6\" (UniqueName: \"kubernetes.io/projected/b7b21b67-7112-4507-a5d2-9036f09a3cdf-kube-api-access-n2nt6\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.512088 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b21b67-7112-4507-a5d2-9036f09a3cdf-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.512135 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b21b67-7112-4507-a5d2-9036f09a3cdf-config\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 
07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.512162 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b7b21b67-7112-4507-a5d2-9036f09a3cdf-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.512198 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7b21b67-7112-4507-a5d2-9036f09a3cdf-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.512248 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.512288 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b21b67-7112-4507-a5d2-9036f09a3cdf-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.512306 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b21b67-7112-4507-a5d2-9036f09a3cdf-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.512552 4599 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.512829 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b7b21b67-7112-4507-a5d2-9036f09a3cdf-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.513597 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b21b67-7112-4507-a5d2-9036f09a3cdf-config\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.513922 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7b21b67-7112-4507-a5d2-9036f09a3cdf-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.517954 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b21b67-7112-4507-a5d2-9036f09a3cdf-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.518376 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b21b67-7112-4507-a5d2-9036f09a3cdf-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc 
kubenswrapper[4599]: I1012 07:47:51.520780 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b21b67-7112-4507-a5d2-9036f09a3cdf-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.530038 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2nt6\" (UniqueName: \"kubernetes.io/projected/b7b21b67-7112-4507-a5d2-9036f09a3cdf-kube-api-access-n2nt6\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.538034 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b7b21b67-7112-4507-a5d2-9036f09a3cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:51 crc kubenswrapper[4599]: I1012 07:47:51.593645 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 12 07:47:56 crc kubenswrapper[4599]: I1012 07:47:56.027516 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lbmsk"] Oct 12 07:47:56 crc kubenswrapper[4599]: I1012 07:47:56.033095 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lbmsk" Oct 12 07:47:56 crc kubenswrapper[4599]: I1012 07:47:56.040244 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lbmsk"] Oct 12 07:47:56 crc kubenswrapper[4599]: I1012 07:47:56.190914 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f139cdb1-1669-49a2-8bc4-f68a6bfd67c7-catalog-content\") pod \"redhat-operators-lbmsk\" (UID: \"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7\") " pod="openshift-marketplace/redhat-operators-lbmsk" Oct 12 07:47:56 crc kubenswrapper[4599]: I1012 07:47:56.191056 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7v7h\" (UniqueName: \"kubernetes.io/projected/f139cdb1-1669-49a2-8bc4-f68a6bfd67c7-kube-api-access-q7v7h\") pod \"redhat-operators-lbmsk\" (UID: \"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7\") " pod="openshift-marketplace/redhat-operators-lbmsk" Oct 12 07:47:56 crc kubenswrapper[4599]: I1012 07:47:56.191129 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f139cdb1-1669-49a2-8bc4-f68a6bfd67c7-utilities\") pod \"redhat-operators-lbmsk\" (UID: \"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7\") " pod="openshift-marketplace/redhat-operators-lbmsk" Oct 12 07:47:56 crc kubenswrapper[4599]: I1012 07:47:56.293136 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f139cdb1-1669-49a2-8bc4-f68a6bfd67c7-utilities\") pod \"redhat-operators-lbmsk\" (UID: \"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7\") " pod="openshift-marketplace/redhat-operators-lbmsk" Oct 12 07:47:56 crc kubenswrapper[4599]: I1012 07:47:56.293291 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f139cdb1-1669-49a2-8bc4-f68a6bfd67c7-catalog-content\") pod \"redhat-operators-lbmsk\" (UID: \"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7\") " pod="openshift-marketplace/redhat-operators-lbmsk" Oct 12 07:47:56 crc kubenswrapper[4599]: I1012 07:47:56.293311 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7v7h\" (UniqueName: \"kubernetes.io/projected/f139cdb1-1669-49a2-8bc4-f68a6bfd67c7-kube-api-access-q7v7h\") pod \"redhat-operators-lbmsk\" (UID: \"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7\") " pod="openshift-marketplace/redhat-operators-lbmsk" Oct 12 07:47:56 crc kubenswrapper[4599]: I1012 07:47:56.293946 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f139cdb1-1669-49a2-8bc4-f68a6bfd67c7-utilities\") pod \"redhat-operators-lbmsk\" (UID: \"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7\") " pod="openshift-marketplace/redhat-operators-lbmsk" Oct 12 07:47:56 crc kubenswrapper[4599]: I1012 07:47:56.294176 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f139cdb1-1669-49a2-8bc4-f68a6bfd67c7-catalog-content\") pod \"redhat-operators-lbmsk\" (UID: \"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7\") " pod="openshift-marketplace/redhat-operators-lbmsk" Oct 12 07:47:56 crc kubenswrapper[4599]: I1012 07:47:56.315159 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7v7h\" (UniqueName: \"kubernetes.io/projected/f139cdb1-1669-49a2-8bc4-f68a6bfd67c7-kube-api-access-q7v7h\") pod \"redhat-operators-lbmsk\" (UID: \"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7\") " pod="openshift-marketplace/redhat-operators-lbmsk" Oct 12 07:47:56 crc kubenswrapper[4599]: I1012 07:47:56.363465 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lbmsk" Oct 12 07:48:01 crc kubenswrapper[4599]: E1012 07:48:01.828005 4599 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:92672cd85fd36317d65faa0525acf849" Oct 12 07:48:01 crc kubenswrapper[4599]: E1012 07:48:01.828181 4599 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:92672cd85fd36317d65faa0525acf849" Oct 12 07:48:01 crc kubenswrapper[4599]: E1012 07:48:01.828355 4599 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:92672cd85fd36317d65faa0525acf849,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h9vbh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-656586ff77-dprk5_openstack(af7a3b11-bc49-4cd5-985e-7be1c153b44c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 07:48:01 crc kubenswrapper[4599]: E1012 07:48:01.829531 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-656586ff77-dprk5" podUID="af7a3b11-bc49-4cd5-985e-7be1c153b44c" Oct 12 07:48:01 crc kubenswrapper[4599]: E1012 07:48:01.836827 4599 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:92672cd85fd36317d65faa0525acf849" Oct 12 07:48:01 crc kubenswrapper[4599]: E1012 07:48:01.836867 4599 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:92672cd85fd36317d65faa0525acf849" Oct 12 07:48:01 crc kubenswrapper[4599]: E1012 07:48:01.837024 4599 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:92672cd85fd36317d65faa0525acf849,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2c5dp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7876f7ff45-r2kr5_openstack(6f00e6e8-0dd1-4756-b7fc-8d00725a9644): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 07:48:01 crc kubenswrapper[4599]: E1012 07:48:01.838764 4599 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7876f7ff45-r2kr5" podUID="6f00e6e8-0dd1-4756-b7fc-8d00725a9644" Oct 12 07:48:03 crc kubenswrapper[4599]: E1012 07:48:03.966706 4599 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:92672cd85fd36317d65faa0525acf849" Oct 12 07:48:03 crc kubenswrapper[4599]: E1012 07:48:03.967150 4599 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:92672cd85fd36317d65faa0525acf849" Oct 12 07:48:03 crc kubenswrapper[4599]: E1012 07:48:03.967328 4599 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:92672cd85fd36317d65faa0525acf849,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:DB_ROOT_PASSWORD,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:DbRootPassword,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:secrets,ReadOnly:true,MountPath:/var/lib/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k696h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,Ru
nAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(4a035bca-ccfe-4dc6-949a-44d2ddf0fa26): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 07:48:03 crc kubenswrapper[4599]: E1012 07:48:03.968840 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="4a035bca-ccfe-4dc6-949a-44d2ddf0fa26" Oct 12 07:48:03 crc kubenswrapper[4599]: E1012 07:48:03.984984 4599 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:92672cd85fd36317d65faa0525acf849" Oct 12 07:48:03 crc kubenswrapper[4599]: E1012 07:48:03.985386 4599 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:92672cd85fd36317d65faa0525acf849" Oct 12 07:48:03 crc kubenswrapper[4599]: E1012 07:48:03.985544 4599 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:92672cd85fd36317d65faa0525acf849,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 
's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6g59r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,Stdin
Once:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(2e036a1a-bc46-419f-88e4-312037490ec1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 07:48:03 crc kubenswrapper[4599]: E1012 07:48:03.987132 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="2e036a1a-bc46-419f-88e4-312037490ec1" Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.625171 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-656586ff77-dprk5" Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.633283 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7876f7ff45-r2kr5" Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.719362 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7876f7ff45-r2kr5" event={"ID":"6f00e6e8-0dd1-4756-b7fc-8d00725a9644","Type":"ContainerDied","Data":"990d59d2a7c170930842d1dd60c8d0384d2c6419f00b58d6b8b81aae7bc5425e"} Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.719464 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7876f7ff45-r2kr5" Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.722367 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-656586ff77-dprk5" Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.722366 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-656586ff77-dprk5" event={"ID":"af7a3b11-bc49-4cd5-985e-7be1c153b44c","Type":"ContainerDied","Data":"d1709d37eac125f2c7b458783c944e0a8d5d4bc8a33f0a973193012e706ed1d3"} Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.746028 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af7a3b11-bc49-4cd5-985e-7be1c153b44c-config\") pod \"af7a3b11-bc49-4cd5-985e-7be1c153b44c\" (UID: \"af7a3b11-bc49-4cd5-985e-7be1c153b44c\") " Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.746114 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c5dp\" (UniqueName: \"kubernetes.io/projected/6f00e6e8-0dd1-4756-b7fc-8d00725a9644-kube-api-access-2c5dp\") pod \"6f00e6e8-0dd1-4756-b7fc-8d00725a9644\" (UID: \"6f00e6e8-0dd1-4756-b7fc-8d00725a9644\") " Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.746183 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f00e6e8-0dd1-4756-b7fc-8d00725a9644-config\") pod \"6f00e6e8-0dd1-4756-b7fc-8d00725a9644\" (UID: \"6f00e6e8-0dd1-4756-b7fc-8d00725a9644\") " Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.746205 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9vbh\" (UniqueName: \"kubernetes.io/projected/af7a3b11-bc49-4cd5-985e-7be1c153b44c-kube-api-access-h9vbh\") pod \"af7a3b11-bc49-4cd5-985e-7be1c153b44c\" (UID: \"af7a3b11-bc49-4cd5-985e-7be1c153b44c\") " Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.746420 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/6f00e6e8-0dd1-4756-b7fc-8d00725a9644-dns-svc\") pod \"6f00e6e8-0dd1-4756-b7fc-8d00725a9644\" (UID: \"6f00e6e8-0dd1-4756-b7fc-8d00725a9644\") " Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.746553 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af7a3b11-bc49-4cd5-985e-7be1c153b44c-config" (OuterVolumeSpecName: "config") pod "af7a3b11-bc49-4cd5-985e-7be1c153b44c" (UID: "af7a3b11-bc49-4cd5-985e-7be1c153b44c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.746869 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af7a3b11-bc49-4cd5-985e-7be1c153b44c-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.748329 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f00e6e8-0dd1-4756-b7fc-8d00725a9644-config" (OuterVolumeSpecName: "config") pod "6f00e6e8-0dd1-4756-b7fc-8d00725a9644" (UID: "6f00e6e8-0dd1-4756-b7fc-8d00725a9644"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.749181 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f00e6e8-0dd1-4756-b7fc-8d00725a9644-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6f00e6e8-0dd1-4756-b7fc-8d00725a9644" (UID: "6f00e6e8-0dd1-4756-b7fc-8d00725a9644"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.756770 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af7a3b11-bc49-4cd5-985e-7be1c153b44c-kube-api-access-h9vbh" (OuterVolumeSpecName: "kube-api-access-h9vbh") pod "af7a3b11-bc49-4cd5-985e-7be1c153b44c" (UID: "af7a3b11-bc49-4cd5-985e-7be1c153b44c"). InnerVolumeSpecName "kube-api-access-h9vbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.759275 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f00e6e8-0dd1-4756-b7fc-8d00725a9644-kube-api-access-2c5dp" (OuterVolumeSpecName: "kube-api-access-2c5dp") pod "6f00e6e8-0dd1-4756-b7fc-8d00725a9644" (UID: "6f00e6e8-0dd1-4756-b7fc-8d00725a9644"). InnerVolumeSpecName "kube-api-access-2c5dp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.849357 4599 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f00e6e8-0dd1-4756-b7fc-8d00725a9644-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.849382 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c5dp\" (UniqueName: \"kubernetes.io/projected/6f00e6e8-0dd1-4756-b7fc-8d00725a9644-kube-api-access-2c5dp\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.849397 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f00e6e8-0dd1-4756-b7fc-8d00725a9644-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.849410 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9vbh\" (UniqueName: 
\"kubernetes.io/projected/af7a3b11-bc49-4cd5-985e-7be1c153b44c-kube-api-access-h9vbh\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:04 crc kubenswrapper[4599]: I1012 07:48:04.998389 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 07:48:05 crc kubenswrapper[4599]: W1012 07:48:05.006986 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ca657cf_7ac2_4cb5_894b_c3a149ee4101.slice/crio-48ead8091005c34ed45cd6ad65b8725f721aa243504bbec10149ec01246e58f1 WatchSource:0}: Error finding container 48ead8091005c34ed45cd6ad65b8725f721aa243504bbec10149ec01246e58f1: Status 404 returned error can't find the container with id 48ead8091005c34ed45cd6ad65b8725f721aa243504bbec10149ec01246e58f1 Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.077399 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9rbk6"] Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.147417 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7876f7ff45-r2kr5"] Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.155669 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7876f7ff45-r2kr5"] Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.182169 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-656586ff77-dprk5"] Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.186119 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-656586ff77-dprk5"] Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.194757 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.259416 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lbmsk"] Oct 12 07:48:05 crc kubenswrapper[4599]: W1012 
07:48:05.271102 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d3345d9_bc30_42dc_98e0_bfd24fee35ab.slice/crio-7e07e30765ab77c9b34337b3297d502194b8765bd49dfd0a35e834a7752cb1f7 WatchSource:0}: Error finding container 7e07e30765ab77c9b34337b3297d502194b8765bd49dfd0a35e834a7752cb1f7: Status 404 returned error can't find the container with id 7e07e30765ab77c9b34337b3297d502194b8765bd49dfd0a35e834a7752cb1f7 Oct 12 07:48:05 crc kubenswrapper[4599]: W1012 07:48:05.275496 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf139cdb1_1669_49a2_8bc4_f68a6bfd67c7.slice/crio-93ca963b42f9995879fcaf7ee68fc22db633ceeac305f07b1f88f1331891cd8b WatchSource:0}: Error finding container 93ca963b42f9995879fcaf7ee68fc22db633ceeac305f07b1f88f1331891cd8b: Status 404 returned error can't find the container with id 93ca963b42f9995879fcaf7ee68fc22db633ceeac305f07b1f88f1331891cd8b Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.297308 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8s62q"] Oct 12 07:48:05 crc kubenswrapper[4599]: W1012 07:48:05.301113 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78cb767a_31ee_4e29_b075_e773a43272c2.slice/crio-76d87299e649480e865c2b47e574c70b92b47464e34336478f5783b9e2d591e9 WatchSource:0}: Error finding container 76d87299e649480e865c2b47e574c70b92b47464e34336478f5783b9e2d591e9: Status 404 returned error can't find the container with id 76d87299e649480e865c2b47e574c70b92b47464e34336478f5783b9e2d591e9 Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.558199 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f00e6e8-0dd1-4756-b7fc-8d00725a9644" path="/var/lib/kubelet/pods/6f00e6e8-0dd1-4756-b7fc-8d00725a9644/volumes" Oct 12 07:48:05 
crc kubenswrapper[4599]: I1012 07:48:05.558572 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af7a3b11-bc49-4cd5-985e-7be1c153b44c" path="/var/lib/kubelet/pods/af7a3b11-bc49-4cd5-985e-7be1c153b44c/volumes" Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.733487 4599 generic.go:334] "Generic (PLEG): container finished" podID="91fafba3-ee3c-4380-be34-b897a835c882" containerID="a946191ced61b07ba1d89c37e81acb84650d5f7b40f563b56bbdb52beb01d1a4" exitCode=0 Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.733604 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" event={"ID":"91fafba3-ee3c-4380-be34-b897a835c882","Type":"ContainerDied","Data":"a946191ced61b07ba1d89c37e81acb84650d5f7b40f563b56bbdb52beb01d1a4"} Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.737499 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4612b7e8-a507-4c57-989d-3411e4e302dd","Type":"ContainerStarted","Data":"5dcb020f2b51b1d9b71cd795b65e6f8bc68eb5b0896d6d3e8026f5bbb0c957cb"} Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.738982 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0ca657cf-7ac2-4cb5-894b-c3a149ee4101","Type":"ContainerStarted","Data":"48ead8091005c34ed45cd6ad65b8725f721aa243504bbec10149ec01246e58f1"} Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.740880 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"fe3787ed-cb03-457b-ad65-b33044cccffd","Type":"ContainerStarted","Data":"2beab65196f5e587f6a7a6e3756f9cb6fddfea084efb63194e2fab061a3c22f6"} Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.741024 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.742485 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"e76c3f9c-bea3-4b35-852c-65d48f177d8a","Type":"ContainerStarted","Data":"ceb23feeb5d7a6534f6ab80c7ed64473431485657f6cf0406b2ad661df540d99"} Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.744445 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26","Type":"ContainerStarted","Data":"3defc0014d797b3ad71ef0ad6d0eb678f9eb07af1986182b2919e353507e53c8"} Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.759785 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8s62q" event={"ID":"78cb767a-31ee-4e29-b075-e773a43272c2","Type":"ContainerStarted","Data":"76d87299e649480e865c2b47e574c70b92b47464e34336478f5783b9e2d591e9"} Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.764455 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9d3345d9-bc30-42dc-98e0-bfd24fee35ab","Type":"ContainerStarted","Data":"7e07e30765ab77c9b34337b3297d502194b8765bd49dfd0a35e834a7752cb1f7"} Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.778381 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9rbk6" event={"ID":"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712","Type":"ContainerStarted","Data":"3729e5a23e1e8d63f7d82067b9b75851f06cde4d7d30ea479dbe90a0de8f384c"} Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.780770 4599 generic.go:334] "Generic (PLEG): container finished" podID="f139cdb1-1669-49a2-8bc4-f68a6bfd67c7" containerID="e0779ce472de984f1c08ec40b9e84f227723d1ccc1bd774eae855920872fc430" exitCode=0 Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.780903 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbmsk" event={"ID":"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7","Type":"ContainerDied","Data":"e0779ce472de984f1c08ec40b9e84f227723d1ccc1bd774eae855920872fc430"} Oct 12 
07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.780959 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbmsk" event={"ID":"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7","Type":"ContainerStarted","Data":"93ca963b42f9995879fcaf7ee68fc22db633ceeac305f07b1f88f1331891cd8b"} Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.784717 4599 generic.go:334] "Generic (PLEG): container finished" podID="f7ae0131-ce00-4e3f-a679-a9745562291d" containerID="2b5d399a848bfc99c5322ef3076b6dc7bc0144b289965a03cd4f9f5e57ca7488" exitCode=0 Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.784744 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" event={"ID":"f7ae0131-ce00-4e3f-a679-a9745562291d","Type":"ContainerDied","Data":"2b5d399a848bfc99c5322ef3076b6dc7bc0144b289965a03cd4f9f5e57ca7488"} Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.795465 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.771587894 podStartE2EDuration="23.795446826s" podCreationTimestamp="2025-10-12 07:47:42 +0000 UTC" firstStartedPulling="2025-10-12 07:47:43.555080903 +0000 UTC m=+760.344276405" lastFinishedPulling="2025-10-12 07:48:04.578939834 +0000 UTC m=+781.368135337" observedRunningTime="2025-10-12 07:48:05.790393751 +0000 UTC m=+782.579589263" watchObservedRunningTime="2025-10-12 07:48:05.795446826 +0000 UTC m=+782.584642328" Oct 12 07:48:05 crc kubenswrapper[4599]: I1012 07:48:05.844556 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 12 07:48:05 crc kubenswrapper[4599]: W1012 07:48:05.893130 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7b21b67_7112_4507_a5d2_9036f09a3cdf.slice/crio-e8ba9c017ad609b1b1d02c60097b0261c422cc285e3455d853dd7a247fc7194e WatchSource:0}: Error finding container 
e8ba9c017ad609b1b1d02c60097b0261c422cc285e3455d853dd7a247fc7194e: Status 404 returned error can't find the container with id e8ba9c017ad609b1b1d02c60097b0261c422cc285e3455d853dd7a247fc7194e Oct 12 07:48:06 crc kubenswrapper[4599]: I1012 07:48:06.803369 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2e036a1a-bc46-419f-88e4-312037490ec1","Type":"ContainerStarted","Data":"8fe013c90596b20721f39bf5f7fa9a8546584a44b02099db74e730f6de411d84"} Oct 12 07:48:06 crc kubenswrapper[4599]: I1012 07:48:06.805923 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b7b21b67-7112-4507-a5d2-9036f09a3cdf","Type":"ContainerStarted","Data":"e8ba9c017ad609b1b1d02c60097b0261c422cc285e3455d853dd7a247fc7194e"} Oct 12 07:48:07 crc kubenswrapper[4599]: I1012 07:48:07.814386 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0ca657cf-7ac2-4cb5-894b-c3a149ee4101","Type":"ContainerStarted","Data":"4c62790da42dd9b8491acfb336b6b41933eda934dab99f342dc2b627fcb3872a"} Oct 12 07:48:07 crc kubenswrapper[4599]: I1012 07:48:07.814706 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 12 07:48:07 crc kubenswrapper[4599]: I1012 07:48:07.817197 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" event={"ID":"91fafba3-ee3c-4380-be34-b897a835c882","Type":"ContainerStarted","Data":"07c16b8ba9e98a9dc2495ed15740955c70c56ee6d1d92468562577005f712803"} Oct 12 07:48:07 crc kubenswrapper[4599]: I1012 07:48:07.817329 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" Oct 12 07:48:07 crc kubenswrapper[4599]: I1012 07:48:07.821933 4599 generic.go:334] "Generic (PLEG): container finished" podID="f139cdb1-1669-49a2-8bc4-f68a6bfd67c7" 
containerID="1c9db235d81e75c4898b424e4ebc71c758f8229e2b5fb73c74c204082b8f8b04" exitCode=0 Oct 12 07:48:07 crc kubenswrapper[4599]: I1012 07:48:07.822005 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbmsk" event={"ID":"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7","Type":"ContainerDied","Data":"1c9db235d81e75c4898b424e4ebc71c758f8229e2b5fb73c74c204082b8f8b04"} Oct 12 07:48:07 crc kubenswrapper[4599]: I1012 07:48:07.826616 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" event={"ID":"f7ae0131-ce00-4e3f-a679-a9745562291d","Type":"ContainerStarted","Data":"7796354560d808f5774d70e92b9f8274020617e577d26c1b767b6fec5b5abddf"} Oct 12 07:48:07 crc kubenswrapper[4599]: I1012 07:48:07.827252 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" Oct 12 07:48:07 crc kubenswrapper[4599]: I1012 07:48:07.827738 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=22.075908726 podStartE2EDuration="23.827719739s" podCreationTimestamp="2025-10-12 07:47:44 +0000 UTC" firstStartedPulling="2025-10-12 07:48:05.012985645 +0000 UTC m=+781.802181146" lastFinishedPulling="2025-10-12 07:48:06.764796647 +0000 UTC m=+783.553992159" observedRunningTime="2025-10-12 07:48:07.825899775 +0000 UTC m=+784.615095277" watchObservedRunningTime="2025-10-12 07:48:07.827719739 +0000 UTC m=+784.616915241" Oct 12 07:48:07 crc kubenswrapper[4599]: I1012 07:48:07.848042 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" podStartSLOduration=4.461626574 podStartE2EDuration="29.848023689s" podCreationTimestamp="2025-10-12 07:47:38 +0000 UTC" firstStartedPulling="2025-10-12 07:47:39.148549771 +0000 UTC m=+755.937745273" lastFinishedPulling="2025-10-12 07:48:04.534946886 +0000 UTC m=+781.324142388" 
observedRunningTime="2025-10-12 07:48:07.841450928 +0000 UTC m=+784.630646430" watchObservedRunningTime="2025-10-12 07:48:07.848023689 +0000 UTC m=+784.637219191" Oct 12 07:48:07 crc kubenswrapper[4599]: I1012 07:48:07.872827 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" podStartSLOduration=4.175971405 podStartE2EDuration="29.872803165s" podCreationTimestamp="2025-10-12 07:47:38 +0000 UTC" firstStartedPulling="2025-10-12 07:47:38.916685336 +0000 UTC m=+755.705880838" lastFinishedPulling="2025-10-12 07:48:04.613517096 +0000 UTC m=+781.402712598" observedRunningTime="2025-10-12 07:48:07.872327938 +0000 UTC m=+784.661523440" watchObservedRunningTime="2025-10-12 07:48:07.872803165 +0000 UTC m=+784.661998667" Oct 12 07:48:08 crc kubenswrapper[4599]: I1012 07:48:08.836120 4599 generic.go:334] "Generic (PLEG): container finished" podID="4a035bca-ccfe-4dc6-949a-44d2ddf0fa26" containerID="3defc0014d797b3ad71ef0ad6d0eb678f9eb07af1986182b2919e353507e53c8" exitCode=0 Oct 12 07:48:08 crc kubenswrapper[4599]: I1012 07:48:08.836172 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26","Type":"ContainerDied","Data":"3defc0014d797b3ad71ef0ad6d0eb678f9eb07af1986182b2919e353507e53c8"} Oct 12 07:48:08 crc kubenswrapper[4599]: I1012 07:48:08.839734 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbmsk" event={"ID":"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7","Type":"ContainerStarted","Data":"cdabc263cabe01010fcde8d9787fb081990b3ea1bea0c7f033d69ad11403df57"} Oct 12 07:48:08 crc kubenswrapper[4599]: I1012 07:48:08.842461 4599 generic.go:334] "Generic (PLEG): container finished" podID="e76c3f9c-bea3-4b35-852c-65d48f177d8a" containerID="ceb23feeb5d7a6534f6ab80c7ed64473431485657f6cf0406b2ad661df540d99" exitCode=0 Oct 12 07:48:08 crc kubenswrapper[4599]: I1012 07:48:08.842580 4599 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e76c3f9c-bea3-4b35-852c-65d48f177d8a","Type":"ContainerDied","Data":"ceb23feeb5d7a6534f6ab80c7ed64473431485657f6cf0406b2ad661df540d99"} Oct 12 07:48:08 crc kubenswrapper[4599]: I1012 07:48:08.885371 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lbmsk" podStartSLOduration=10.427682251 podStartE2EDuration="12.885353429s" podCreationTimestamp="2025-10-12 07:47:56 +0000 UTC" firstStartedPulling="2025-10-12 07:48:05.883580982 +0000 UTC m=+782.672776484" lastFinishedPulling="2025-10-12 07:48:08.34125216 +0000 UTC m=+785.130447662" observedRunningTime="2025-10-12 07:48:08.881605878 +0000 UTC m=+785.670801380" watchObservedRunningTime="2025-10-12 07:48:08.885353429 +0000 UTC m=+785.674548931" Oct 12 07:48:10 crc kubenswrapper[4599]: I1012 07:48:10.865980 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e76c3f9c-bea3-4b35-852c-65d48f177d8a","Type":"ContainerStarted","Data":"47f071991b888e416d25d6009e227b5797cc9d69b1bc91279c2be9eab3a62a55"} Oct 12 07:48:10 crc kubenswrapper[4599]: I1012 07:48:10.869295 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4a035bca-ccfe-4dc6-949a-44d2ddf0fa26","Type":"ContainerStarted","Data":"9389c3dae9952d0020b6ce122063bc7a574021792d31c9181721a7ab7a45ad2a"} Oct 12 07:48:10 crc kubenswrapper[4599]: I1012 07:48:10.871378 4599 generic.go:334] "Generic (PLEG): container finished" podID="78cb767a-31ee-4e29-b075-e773a43272c2" containerID="a024ba3b15d9a909f4a697e5b7c25b79f89ee4ee24234d0bad8f029d9c737138" exitCode=0 Oct 12 07:48:10 crc kubenswrapper[4599]: I1012 07:48:10.871445 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8s62q" 
event={"ID":"78cb767a-31ee-4e29-b075-e773a43272c2","Type":"ContainerDied","Data":"a024ba3b15d9a909f4a697e5b7c25b79f89ee4ee24234d0bad8f029d9c737138"} Oct 12 07:48:10 crc kubenswrapper[4599]: I1012 07:48:10.872984 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9d3345d9-bc30-42dc-98e0-bfd24fee35ab","Type":"ContainerStarted","Data":"0f94c2b606955397d2f58134fe37ebe1f47010b83846a5e23466a2f502fc315a"} Oct 12 07:48:10 crc kubenswrapper[4599]: I1012 07:48:10.876647 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b7b21b67-7112-4507-a5d2-9036f09a3cdf","Type":"ContainerStarted","Data":"6800d8b3ff55f3a63d11ff8f56c9b86266a000dde76fcaef3d69b5c4358d065d"} Oct 12 07:48:10 crc kubenswrapper[4599]: I1012 07:48:10.878512 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9rbk6" event={"ID":"ff6de4a7-bd76-46bc-a376-b1ec8c5ab712","Type":"ContainerStarted","Data":"aef802f7182a871deca4511a07d5126407b9de40e627defc4cecbbd3c947edda"} Oct 12 07:48:10 crc kubenswrapper[4599]: I1012 07:48:10.878690 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-9rbk6" Oct 12 07:48:10 crc kubenswrapper[4599]: I1012 07:48:10.888907 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.614981416 podStartE2EDuration="29.888897116s" podCreationTimestamp="2025-10-12 07:47:41 +0000 UTC" firstStartedPulling="2025-10-12 07:47:43.339648364 +0000 UTC m=+760.128843866" lastFinishedPulling="2025-10-12 07:48:04.613564064 +0000 UTC m=+781.402759566" observedRunningTime="2025-10-12 07:48:10.881886477 +0000 UTC m=+787.671081979" watchObservedRunningTime="2025-10-12 07:48:10.888897116 +0000 UTC m=+787.678092617" Oct 12 07:48:10 crc kubenswrapper[4599]: I1012 07:48:10.906822 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-controller-9rbk6" podStartSLOduration=17.556338813 podStartE2EDuration="22.906807188s" podCreationTimestamp="2025-10-12 07:47:48 +0000 UTC" firstStartedPulling="2025-10-12 07:48:05.104644372 +0000 UTC m=+781.893839874" lastFinishedPulling="2025-10-12 07:48:10.455112747 +0000 UTC m=+787.244308249" observedRunningTime="2025-10-12 07:48:10.900940158 +0000 UTC m=+787.690135660" watchObservedRunningTime="2025-10-12 07:48:10.906807188 +0000 UTC m=+787.696002690" Oct 12 07:48:10 crc kubenswrapper[4599]: I1012 07:48:10.958215 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223372005.896584 podStartE2EDuration="30.958193086s" podCreationTimestamp="2025-10-12 07:47:40 +0000 UTC" firstStartedPulling="2025-10-12 07:47:42.125669372 +0000 UTC m=+758.914864873" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:48:10.95408795 +0000 UTC m=+787.743283452" watchObservedRunningTime="2025-10-12 07:48:10.958193086 +0000 UTC m=+787.747388588" Oct 12 07:48:11 crc kubenswrapper[4599]: I1012 07:48:11.372753 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 12 07:48:11 crc kubenswrapper[4599]: I1012 07:48:11.372922 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 12 07:48:11 crc kubenswrapper[4599]: I1012 07:48:11.893114 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8s62q" event={"ID":"78cb767a-31ee-4e29-b075-e773a43272c2","Type":"ContainerStarted","Data":"9a0c197f367292eab65096dd707093dca408ea4d87e61bbb0f96806f335fe074"} Oct 12 07:48:11 crc kubenswrapper[4599]: I1012 07:48:11.893417 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8s62q" 
event={"ID":"78cb767a-31ee-4e29-b075-e773a43272c2","Type":"ContainerStarted","Data":"a25e2a9f9c9c1094e5bbb6db6bd04c3b43780b3a0e90297f9cb45c8d7803855d"} Oct 12 07:48:11 crc kubenswrapper[4599]: I1012 07:48:11.893890 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:48:11 crc kubenswrapper[4599]: I1012 07:48:11.893915 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:48:11 crc kubenswrapper[4599]: I1012 07:48:11.917468 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-8s62q" podStartSLOduration=19.42382038 podStartE2EDuration="23.917449163s" podCreationTimestamp="2025-10-12 07:47:48 +0000 UTC" firstStartedPulling="2025-10-12 07:48:05.30946053 +0000 UTC m=+782.098656032" lastFinishedPulling="2025-10-12 07:48:09.803089313 +0000 UTC m=+786.592284815" observedRunningTime="2025-10-12 07:48:11.911197898 +0000 UTC m=+788.700393390" watchObservedRunningTime="2025-10-12 07:48:11.917449163 +0000 UTC m=+788.706644664" Oct 12 07:48:12 crc kubenswrapper[4599]: I1012 07:48:12.723220 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 12 07:48:12 crc kubenswrapper[4599]: I1012 07:48:12.723483 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 12 07:48:13 crc kubenswrapper[4599]: I1012 07:48:13.039587 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 12 07:48:13 crc kubenswrapper[4599]: I1012 07:48:13.440629 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" Oct 12 07:48:13 crc kubenswrapper[4599]: I1012 07:48:13.729364 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" 
Oct 12 07:48:13 crc kubenswrapper[4599]: I1012 07:48:13.770388 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7596fbdcc-5l45x"] Oct 12 07:48:13 crc kubenswrapper[4599]: I1012 07:48:13.916405 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9d3345d9-bc30-42dc-98e0-bfd24fee35ab","Type":"ContainerStarted","Data":"52dad2fe746d17209ecd7a4ab102a18be4c6b069b7322329dddcea9de3563401"} Oct 12 07:48:13 crc kubenswrapper[4599]: I1012 07:48:13.919419 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b7b21b67-7112-4507-a5d2-9036f09a3cdf","Type":"ContainerStarted","Data":"abc581fb5e8b23f68734b792645d0b8071e0a20f483b68fd903f850e8febd376"} Oct 12 07:48:13 crc kubenswrapper[4599]: I1012 07:48:13.919561 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" podUID="f7ae0131-ce00-4e3f-a679-a9745562291d" containerName="dnsmasq-dns" containerID="cri-o://7796354560d808f5774d70e92b9f8274020617e577d26c1b767b6fec5b5abddf" gracePeriod=10 Oct 12 07:48:13 crc kubenswrapper[4599]: I1012 07:48:13.932880 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=18.129196784 podStartE2EDuration="25.932862675s" podCreationTimestamp="2025-10-12 07:47:48 +0000 UTC" firstStartedPulling="2025-10-12 07:48:05.281433337 +0000 UTC m=+782.070628839" lastFinishedPulling="2025-10-12 07:48:13.085099229 +0000 UTC m=+789.874294730" observedRunningTime="2025-10-12 07:48:13.931032812 +0000 UTC m=+790.720228314" watchObservedRunningTime="2025-10-12 07:48:13.932862675 +0000 UTC m=+790.722058176" Oct 12 07:48:13 crc kubenswrapper[4599]: I1012 07:48:13.948797 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=16.76666769 podStartE2EDuration="23.94878572s" 
podCreationTimestamp="2025-10-12 07:47:50 +0000 UTC" firstStartedPulling="2025-10-12 07:48:05.896462056 +0000 UTC m=+782.685657559" lastFinishedPulling="2025-10-12 07:48:13.078580088 +0000 UTC m=+789.867775589" observedRunningTime="2025-10-12 07:48:13.948119232 +0000 UTC m=+790.737314734" watchObservedRunningTime="2025-10-12 07:48:13.94878572 +0000 UTC m=+790.737981221" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.356485 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.455717 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd596\" (UniqueName: \"kubernetes.io/projected/f7ae0131-ce00-4e3f-a679-a9745562291d-kube-api-access-cd596\") pod \"f7ae0131-ce00-4e3f-a679-a9745562291d\" (UID: \"f7ae0131-ce00-4e3f-a679-a9745562291d\") " Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.455811 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7ae0131-ce00-4e3f-a679-a9745562291d-config\") pod \"f7ae0131-ce00-4e3f-a679-a9745562291d\" (UID: \"f7ae0131-ce00-4e3f-a679-a9745562291d\") " Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.456049 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7ae0131-ce00-4e3f-a679-a9745562291d-dns-svc\") pod \"f7ae0131-ce00-4e3f-a679-a9745562291d\" (UID: \"f7ae0131-ce00-4e3f-a679-a9745562291d\") " Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.460802 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7ae0131-ce00-4e3f-a679-a9745562291d-kube-api-access-cd596" (OuterVolumeSpecName: "kube-api-access-cd596") pod "f7ae0131-ce00-4e3f-a679-a9745562291d" (UID: "f7ae0131-ce00-4e3f-a679-a9745562291d"). 
InnerVolumeSpecName "kube-api-access-cd596". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.484651 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7ae0131-ce00-4e3f-a679-a9745562291d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f7ae0131-ce00-4e3f-a679-a9745562291d" (UID: "f7ae0131-ce00-4e3f-a679-a9745562291d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.486567 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7ae0131-ce00-4e3f-a679-a9745562291d-config" (OuterVolumeSpecName: "config") pod "f7ae0131-ce00-4e3f-a679-a9745562291d" (UID: "f7ae0131-ce00-4e3f-a679-a9745562291d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.558060 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd596\" (UniqueName: \"kubernetes.io/projected/f7ae0131-ce00-4e3f-a679-a9745562291d-kube-api-access-cd596\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.558102 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7ae0131-ce00-4e3f-a679-a9745562291d-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.558113 4599 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7ae0131-ce00-4e3f-a679-a9745562291d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.749688 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54f5b6d497-tl42t"] Oct 12 07:48:14 crc kubenswrapper[4599]: E1012 07:48:14.750021 4599 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f7ae0131-ce00-4e3f-a679-a9745562291d" containerName="dnsmasq-dns" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.750035 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ae0131-ce00-4e3f-a679-a9745562291d" containerName="dnsmasq-dns" Oct 12 07:48:14 crc kubenswrapper[4599]: E1012 07:48:14.750063 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ae0131-ce00-4e3f-a679-a9745562291d" containerName="init" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.750068 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ae0131-ce00-4e3f-a679-a9745562291d" containerName="init" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.750229 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ae0131-ce00-4e3f-a679-a9745562291d" containerName="dnsmasq-dns" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.751042 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.760583 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.764835 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f5b6d497-tl42t"] Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.769936 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.843313 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.863097 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn8zx\" (UniqueName: 
\"kubernetes.io/projected/d9eb733c-6880-4f6e-a004-cc4028582ca9-kube-api-access-kn8zx\") pod \"dnsmasq-dns-54f5b6d497-tl42t\" (UID: \"d9eb733c-6880-4f6e-a004-cc4028582ca9\") " pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.863228 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9eb733c-6880-4f6e-a004-cc4028582ca9-config\") pod \"dnsmasq-dns-54f5b6d497-tl42t\" (UID: \"d9eb733c-6880-4f6e-a004-cc4028582ca9\") " pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.863273 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9eb733c-6880-4f6e-a004-cc4028582ca9-dns-svc\") pod \"dnsmasq-dns-54f5b6d497-tl42t\" (UID: \"d9eb733c-6880-4f6e-a004-cc4028582ca9\") " pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.896183 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.955177 4599 generic.go:334] "Generic (PLEG): container finished" podID="f7ae0131-ce00-4e3f-a679-a9745562291d" containerID="7796354560d808f5774d70e92b9f8274020617e577d26c1b767b6fec5b5abddf" exitCode=0 Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.956151 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.960021 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" event={"ID":"f7ae0131-ce00-4e3f-a679-a9745562291d","Type":"ContainerDied","Data":"7796354560d808f5774d70e92b9f8274020617e577d26c1b767b6fec5b5abddf"} Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.960091 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7596fbdcc-5l45x" event={"ID":"f7ae0131-ce00-4e3f-a679-a9745562291d","Type":"ContainerDied","Data":"993abbbe1b8a3cb890c897db4cac598f11c8a19d84ab00b34ff453feeed18552"} Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.960113 4599 scope.go:117] "RemoveContainer" containerID="7796354560d808f5774d70e92b9f8274020617e577d26c1b767b6fec5b5abddf" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.964654 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9eb733c-6880-4f6e-a004-cc4028582ca9-config\") pod \"dnsmasq-dns-54f5b6d497-tl42t\" (UID: \"d9eb733c-6880-4f6e-a004-cc4028582ca9\") " pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.964775 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9eb733c-6880-4f6e-a004-cc4028582ca9-dns-svc\") pod \"dnsmasq-dns-54f5b6d497-tl42t\" (UID: \"d9eb733c-6880-4f6e-a004-cc4028582ca9\") " pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.964821 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn8zx\" (UniqueName: \"kubernetes.io/projected/d9eb733c-6880-4f6e-a004-cc4028582ca9-kube-api-access-kn8zx\") pod \"dnsmasq-dns-54f5b6d497-tl42t\" (UID: \"d9eb733c-6880-4f6e-a004-cc4028582ca9\") " 
pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.965834 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9eb733c-6880-4f6e-a004-cc4028582ca9-config\") pod \"dnsmasq-dns-54f5b6d497-tl42t\" (UID: \"d9eb733c-6880-4f6e-a004-cc4028582ca9\") " pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.967132 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9eb733c-6880-4f6e-a004-cc4028582ca9-dns-svc\") pod \"dnsmasq-dns-54f5b6d497-tl42t\" (UID: \"d9eb733c-6880-4f6e-a004-cc4028582ca9\") " pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.986312 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn8zx\" (UniqueName: \"kubernetes.io/projected/d9eb733c-6880-4f6e-a004-cc4028582ca9-kube-api-access-kn8zx\") pod \"dnsmasq-dns-54f5b6d497-tl42t\" (UID: \"d9eb733c-6880-4f6e-a004-cc4028582ca9\") " pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" Oct 12 07:48:14 crc kubenswrapper[4599]: I1012 07:48:14.986568 4599 scope.go:117] "RemoveContainer" containerID="2b5d399a848bfc99c5322ef3076b6dc7bc0144b289965a03cd4f9f5e57ca7488" Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.006850 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7596fbdcc-5l45x"] Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.011779 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7596fbdcc-5l45x"] Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.023426 4599 scope.go:117] "RemoveContainer" containerID="7796354560d808f5774d70e92b9f8274020617e577d26c1b767b6fec5b5abddf" Oct 12 07:48:15 crc kubenswrapper[4599]: E1012 07:48:15.023932 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"7796354560d808f5774d70e92b9f8274020617e577d26c1b767b6fec5b5abddf\": container with ID starting with 7796354560d808f5774d70e92b9f8274020617e577d26c1b767b6fec5b5abddf not found: ID does not exist" containerID="7796354560d808f5774d70e92b9f8274020617e577d26c1b767b6fec5b5abddf" Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.023987 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7796354560d808f5774d70e92b9f8274020617e577d26c1b767b6fec5b5abddf"} err="failed to get container status \"7796354560d808f5774d70e92b9f8274020617e577d26c1b767b6fec5b5abddf\": rpc error: code = NotFound desc = could not find container \"7796354560d808f5774d70e92b9f8274020617e577d26c1b767b6fec5b5abddf\": container with ID starting with 7796354560d808f5774d70e92b9f8274020617e577d26c1b767b6fec5b5abddf not found: ID does not exist" Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.024035 4599 scope.go:117] "RemoveContainer" containerID="2b5d399a848bfc99c5322ef3076b6dc7bc0144b289965a03cd4f9f5e57ca7488" Oct 12 07:48:15 crc kubenswrapper[4599]: E1012 07:48:15.024465 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b5d399a848bfc99c5322ef3076b6dc7bc0144b289965a03cd4f9f5e57ca7488\": container with ID starting with 2b5d399a848bfc99c5322ef3076b6dc7bc0144b289965a03cd4f9f5e57ca7488 not found: ID does not exist" containerID="2b5d399a848bfc99c5322ef3076b6dc7bc0144b289965a03cd4f9f5e57ca7488" Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.024504 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b5d399a848bfc99c5322ef3076b6dc7bc0144b289965a03cd4f9f5e57ca7488"} err="failed to get container status \"2b5d399a848bfc99c5322ef3076b6dc7bc0144b289965a03cd4f9f5e57ca7488\": rpc error: code = NotFound desc = could not find container 
\"2b5d399a848bfc99c5322ef3076b6dc7bc0144b289965a03cd4f9f5e57ca7488\": container with ID starting with 2b5d399a848bfc99c5322ef3076b6dc7bc0144b289965a03cd4f9f5e57ca7488 not found: ID does not exist" Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.069008 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.447680 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f5b6d497-tl42t"] Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.556079 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7ae0131-ce00-4e3f-a679-a9745562291d" path="/var/lib/kubelet/pods/f7ae0131-ce00-4e3f-a679-a9745562291d/volumes" Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.594238 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.625542 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.870191 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.874897 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.877068 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.877362 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-77b85" Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.877568 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.877621 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.887444 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.963221 4599 generic.go:334] "Generic (PLEG): container finished" podID="d9eb733c-6880-4f6e-a004-cc4028582ca9" containerID="f721268981052aee61add8db6d2f6afacb4f9864a830bd3c38c0f9fe0bea9044" exitCode=0 Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.963435 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" event={"ID":"d9eb733c-6880-4f6e-a004-cc4028582ca9","Type":"ContainerDied","Data":"f721268981052aee61add8db6d2f6afacb4f9864a830bd3c38c0f9fe0bea9044"} Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.963497 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.963512 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" event={"ID":"d9eb733c-6880-4f6e-a004-cc4028582ca9","Type":"ContainerStarted","Data":"b1d7204e87c2600aa199f6b0d8229509b446d20e9908076032d46f0041b2d798"} Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.982104 4599 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") " pod="openstack/swift-storage-0" Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.982315 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-lock\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") " pod="openstack/swift-storage-0" Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.982413 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-cache\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") " pod="openstack/swift-storage-0" Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.982540 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9wvh\" (UniqueName: \"kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-kube-api-access-k9wvh\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") " pod="openstack/swift-storage-0" Oct 12 07:48:15 crc kubenswrapper[4599]: I1012 07:48:15.982628 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-etc-swift\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") " pod="openstack/swift-storage-0" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.004759 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 12 07:48:16 crc kubenswrapper[4599]: 
I1012 07:48:16.084797 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-etc-swift\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") " pod="openstack/swift-storage-0" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.085015 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") " pod="openstack/swift-storage-0" Oct 12 07:48:16 crc kubenswrapper[4599]: E1012 07:48:16.085068 4599 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 12 07:48:16 crc kubenswrapper[4599]: E1012 07:48:16.085104 4599 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.085155 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-lock\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") " pod="openstack/swift-storage-0" Oct 12 07:48:16 crc kubenswrapper[4599]: E1012 07:48:16.085181 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-etc-swift podName:7f1c5e37-2b6d-4058-86e1-466baaa0f6c4 nodeName:}" failed. No retries permitted until 2025-10-12 07:48:16.585160077 +0000 UTC m=+793.374355579 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-etc-swift") pod "swift-storage-0" (UID: "7f1c5e37-2b6d-4058-86e1-466baaa0f6c4") : configmap "swift-ring-files" not found Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.085265 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-cache\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") " pod="openstack/swift-storage-0" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.085356 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9wvh\" (UniqueName: \"kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-kube-api-access-k9wvh\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") " pod="openstack/swift-storage-0" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.085491 4599 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.085868 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-lock\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") " pod="openstack/swift-storage-0" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.086989 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-cache\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") " 
pod="openstack/swift-storage-0" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.109048 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9wvh\" (UniqueName: \"kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-kube-api-access-k9wvh\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") " pod="openstack/swift-storage-0" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.112641 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") " pod="openstack/swift-storage-0" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.283624 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f5b6d497-tl42t"] Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.316473 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85dd5d7485-f9tjt"] Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.317870 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85dd5d7485-f9tjt" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.319602 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.327967 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85dd5d7485-f9tjt"] Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.357320 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-w8g6f"] Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.358272 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.363390 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-w8g6f"] Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.363909 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.364103 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.364228 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.364484 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lbmsk" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.364559 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lbmsk" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.414975 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lbmsk" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.459756 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-4dn56"] Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.461744 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-4dn56" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.465214 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.487294 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4dn56"] Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.492064 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxh86\" (UniqueName: \"kubernetes.io/projected/67c4df83-6f28-4ad0-b53c-383ebe12642f-kube-api-access-pxh86\") pod \"dnsmasq-dns-85dd5d7485-f9tjt\" (UID: \"67c4df83-6f28-4ad0-b53c-383ebe12642f\") " pod="openstack/dnsmasq-dns-85dd5d7485-f9tjt" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.492110 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5llh\" (UniqueName: \"kubernetes.io/projected/3f44cfe9-f015-4084-b100-fbb08f528667-kube-api-access-t5llh\") pod \"swift-ring-rebalance-w8g6f\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.492158 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3f44cfe9-f015-4084-b100-fbb08f528667-ring-data-devices\") pod \"swift-ring-rebalance-w8g6f\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.492179 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67c4df83-6f28-4ad0-b53c-383ebe12642f-dns-svc\") pod \"dnsmasq-dns-85dd5d7485-f9tjt\" (UID: 
\"67c4df83-6f28-4ad0-b53c-383ebe12642f\") " pod="openstack/dnsmasq-dns-85dd5d7485-f9tjt" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.492216 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f44cfe9-f015-4084-b100-fbb08f528667-scripts\") pod \"swift-ring-rebalance-w8g6f\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.492238 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67c4df83-6f28-4ad0-b53c-383ebe12642f-config\") pod \"dnsmasq-dns-85dd5d7485-f9tjt\" (UID: \"67c4df83-6f28-4ad0-b53c-383ebe12642f\") " pod="openstack/dnsmasq-dns-85dd5d7485-f9tjt" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.492260 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3f44cfe9-f015-4084-b100-fbb08f528667-dispersionconf\") pod \"swift-ring-rebalance-w8g6f\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.492277 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3f44cfe9-f015-4084-b100-fbb08f528667-swiftconf\") pod \"swift-ring-rebalance-w8g6f\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.492346 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f44cfe9-f015-4084-b100-fbb08f528667-combined-ca-bundle\") pod \"swift-ring-rebalance-w8g6f\" (UID: 
\"3f44cfe9-f015-4084-b100-fbb08f528667\") " pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.492378 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67c4df83-6f28-4ad0-b53c-383ebe12642f-ovsdbserver-sb\") pod \"dnsmasq-dns-85dd5d7485-f9tjt\" (UID: \"67c4df83-6f28-4ad0-b53c-383ebe12642f\") " pod="openstack/dnsmasq-dns-85dd5d7485-f9tjt" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.492420 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3f44cfe9-f015-4084-b100-fbb08f528667-etc-swift\") pod \"swift-ring-rebalance-w8g6f\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.594345 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ce16e02d-17cf-467a-aca5-944a67d4cd79-ovs-rundir\") pod \"ovn-controller-metrics-4dn56\" (UID: \"ce16e02d-17cf-467a-aca5-944a67d4cd79\") " pod="openstack/ovn-controller-metrics-4dn56" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.594414 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3f44cfe9-f015-4084-b100-fbb08f528667-etc-swift\") pod \"swift-ring-rebalance-w8g6f\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.594442 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce16e02d-17cf-467a-aca5-944a67d4cd79-combined-ca-bundle\") pod \"ovn-controller-metrics-4dn56\" (UID: 
\"ce16e02d-17cf-467a-aca5-944a67d4cd79\") " pod="openstack/ovn-controller-metrics-4dn56" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.594478 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-etc-swift\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") " pod="openstack/swift-storage-0" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.594496 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc5zc\" (UniqueName: \"kubernetes.io/projected/ce16e02d-17cf-467a-aca5-944a67d4cd79-kube-api-access-bc5zc\") pod \"ovn-controller-metrics-4dn56\" (UID: \"ce16e02d-17cf-467a-aca5-944a67d4cd79\") " pod="openstack/ovn-controller-metrics-4dn56" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.594556 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ce16e02d-17cf-467a-aca5-944a67d4cd79-ovn-rundir\") pod \"ovn-controller-metrics-4dn56\" (UID: \"ce16e02d-17cf-467a-aca5-944a67d4cd79\") " pod="openstack/ovn-controller-metrics-4dn56" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.594577 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxh86\" (UniqueName: \"kubernetes.io/projected/67c4df83-6f28-4ad0-b53c-383ebe12642f-kube-api-access-pxh86\") pod \"dnsmasq-dns-85dd5d7485-f9tjt\" (UID: \"67c4df83-6f28-4ad0-b53c-383ebe12642f\") " pod="openstack/dnsmasq-dns-85dd5d7485-f9tjt" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.594603 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce16e02d-17cf-467a-aca5-944a67d4cd79-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4dn56\" 
(UID: \"ce16e02d-17cf-467a-aca5-944a67d4cd79\") " pod="openstack/ovn-controller-metrics-4dn56" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.594634 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5llh\" (UniqueName: \"kubernetes.io/projected/3f44cfe9-f015-4084-b100-fbb08f528667-kube-api-access-t5llh\") pod \"swift-ring-rebalance-w8g6f\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.594677 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce16e02d-17cf-467a-aca5-944a67d4cd79-config\") pod \"ovn-controller-metrics-4dn56\" (UID: \"ce16e02d-17cf-467a-aca5-944a67d4cd79\") " pod="openstack/ovn-controller-metrics-4dn56" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.594746 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3f44cfe9-f015-4084-b100-fbb08f528667-ring-data-devices\") pod \"swift-ring-rebalance-w8g6f\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.594781 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67c4df83-6f28-4ad0-b53c-383ebe12642f-dns-svc\") pod \"dnsmasq-dns-85dd5d7485-f9tjt\" (UID: \"67c4df83-6f28-4ad0-b53c-383ebe12642f\") " pod="openstack/dnsmasq-dns-85dd5d7485-f9tjt" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.594801 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f44cfe9-f015-4084-b100-fbb08f528667-scripts\") pod \"swift-ring-rebalance-w8g6f\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " 
pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.594829 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67c4df83-6f28-4ad0-b53c-383ebe12642f-config\") pod \"dnsmasq-dns-85dd5d7485-f9tjt\" (UID: \"67c4df83-6f28-4ad0-b53c-383ebe12642f\") " pod="openstack/dnsmasq-dns-85dd5d7485-f9tjt" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.594874 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3f44cfe9-f015-4084-b100-fbb08f528667-dispersionconf\") pod \"swift-ring-rebalance-w8g6f\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.594893 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3f44cfe9-f015-4084-b100-fbb08f528667-swiftconf\") pod \"swift-ring-rebalance-w8g6f\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.594921 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f44cfe9-f015-4084-b100-fbb08f528667-combined-ca-bundle\") pod \"swift-ring-rebalance-w8g6f\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: E1012 07:48:16.594927 4599 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 12 07:48:16 crc kubenswrapper[4599]: E1012 07:48:16.594967 4599 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 12 07:48:16 crc kubenswrapper[4599]: E1012 
07:48:16.595060 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-etc-swift podName:7f1c5e37-2b6d-4058-86e1-466baaa0f6c4 nodeName:}" failed. No retries permitted until 2025-10-12 07:48:17.595013406 +0000 UTC m=+794.384208907 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-etc-swift") pod "swift-storage-0" (UID: "7f1c5e37-2b6d-4058-86e1-466baaa0f6c4") : configmap "swift-ring-files" not found Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.595821 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67c4df83-6f28-4ad0-b53c-383ebe12642f-ovsdbserver-sb\") pod \"dnsmasq-dns-85dd5d7485-f9tjt\" (UID: \"67c4df83-6f28-4ad0-b53c-383ebe12642f\") " pod="openstack/dnsmasq-dns-85dd5d7485-f9tjt" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.595915 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3f44cfe9-f015-4084-b100-fbb08f528667-etc-swift\") pod \"swift-ring-rebalance-w8g6f\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.594953 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67c4df83-6f28-4ad0-b53c-383ebe12642f-ovsdbserver-sb\") pod \"dnsmasq-dns-85dd5d7485-f9tjt\" (UID: \"67c4df83-6f28-4ad0-b53c-383ebe12642f\") " pod="openstack/dnsmasq-dns-85dd5d7485-f9tjt" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.596714 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67c4df83-6f28-4ad0-b53c-383ebe12642f-dns-svc\") pod \"dnsmasq-dns-85dd5d7485-f9tjt\" (UID: 
\"67c4df83-6f28-4ad0-b53c-383ebe12642f\") " pod="openstack/dnsmasq-dns-85dd5d7485-f9tjt" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.597419 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67c4df83-6f28-4ad0-b53c-383ebe12642f-config\") pod \"dnsmasq-dns-85dd5d7485-f9tjt\" (UID: \"67c4df83-6f28-4ad0-b53c-383ebe12642f\") " pod="openstack/dnsmasq-dns-85dd5d7485-f9tjt" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.597435 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3f44cfe9-f015-4084-b100-fbb08f528667-ring-data-devices\") pod \"swift-ring-rebalance-w8g6f\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.603820 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3f44cfe9-f015-4084-b100-fbb08f528667-dispersionconf\") pod \"swift-ring-rebalance-w8g6f\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.610906 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3f44cfe9-f015-4084-b100-fbb08f528667-swiftconf\") pod \"swift-ring-rebalance-w8g6f\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.611927 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxh86\" (UniqueName: \"kubernetes.io/projected/67c4df83-6f28-4ad0-b53c-383ebe12642f-kube-api-access-pxh86\") pod \"dnsmasq-dns-85dd5d7485-f9tjt\" (UID: \"67c4df83-6f28-4ad0-b53c-383ebe12642f\") " pod="openstack/dnsmasq-dns-85dd5d7485-f9tjt" Oct 12 07:48:16 crc 
kubenswrapper[4599]: I1012 07:48:16.611976 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f44cfe9-f015-4084-b100-fbb08f528667-scripts\") pod \"swift-ring-rebalance-w8g6f\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.620803 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5llh\" (UniqueName: \"kubernetes.io/projected/3f44cfe9-f015-4084-b100-fbb08f528667-kube-api-access-t5llh\") pod \"swift-ring-rebalance-w8g6f\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.622083 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f44cfe9-f015-4084-b100-fbb08f528667-combined-ca-bundle\") pod \"swift-ring-rebalance-w8g6f\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.640946 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85dd5d7485-f9tjt" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.693773 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.699729 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ce16e02d-17cf-467a-aca5-944a67d4cd79-ovs-rundir\") pod \"ovn-controller-metrics-4dn56\" (UID: \"ce16e02d-17cf-467a-aca5-944a67d4cd79\") " pod="openstack/ovn-controller-metrics-4dn56" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.699775 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce16e02d-17cf-467a-aca5-944a67d4cd79-combined-ca-bundle\") pod \"ovn-controller-metrics-4dn56\" (UID: \"ce16e02d-17cf-467a-aca5-944a67d4cd79\") " pod="openstack/ovn-controller-metrics-4dn56" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.699816 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc5zc\" (UniqueName: \"kubernetes.io/projected/ce16e02d-17cf-467a-aca5-944a67d4cd79-kube-api-access-bc5zc\") pod \"ovn-controller-metrics-4dn56\" (UID: \"ce16e02d-17cf-467a-aca5-944a67d4cd79\") " pod="openstack/ovn-controller-metrics-4dn56" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.699850 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ce16e02d-17cf-467a-aca5-944a67d4cd79-ovn-rundir\") pod \"ovn-controller-metrics-4dn56\" (UID: \"ce16e02d-17cf-467a-aca5-944a67d4cd79\") " pod="openstack/ovn-controller-metrics-4dn56" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.699874 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce16e02d-17cf-467a-aca5-944a67d4cd79-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4dn56\" (UID: \"ce16e02d-17cf-467a-aca5-944a67d4cd79\") " 
pod="openstack/ovn-controller-metrics-4dn56" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.699908 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce16e02d-17cf-467a-aca5-944a67d4cd79-config\") pod \"ovn-controller-metrics-4dn56\" (UID: \"ce16e02d-17cf-467a-aca5-944a67d4cd79\") " pod="openstack/ovn-controller-metrics-4dn56" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.700624 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce16e02d-17cf-467a-aca5-944a67d4cd79-config\") pod \"ovn-controller-metrics-4dn56\" (UID: \"ce16e02d-17cf-467a-aca5-944a67d4cd79\") " pod="openstack/ovn-controller-metrics-4dn56" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.700823 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ce16e02d-17cf-467a-aca5-944a67d4cd79-ovs-rundir\") pod \"ovn-controller-metrics-4dn56\" (UID: \"ce16e02d-17cf-467a-aca5-944a67d4cd79\") " pod="openstack/ovn-controller-metrics-4dn56" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.701605 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ce16e02d-17cf-467a-aca5-944a67d4cd79-ovn-rundir\") pod \"ovn-controller-metrics-4dn56\" (UID: \"ce16e02d-17cf-467a-aca5-944a67d4cd79\") " pod="openstack/ovn-controller-metrics-4dn56" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.708225 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce16e02d-17cf-467a-aca5-944a67d4cd79-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4dn56\" (UID: \"ce16e02d-17cf-467a-aca5-944a67d4cd79\") " pod="openstack/ovn-controller-metrics-4dn56" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.709728 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce16e02d-17cf-467a-aca5-944a67d4cd79-combined-ca-bundle\") pod \"ovn-controller-metrics-4dn56\" (UID: \"ce16e02d-17cf-467a-aca5-944a67d4cd79\") " pod="openstack/ovn-controller-metrics-4dn56" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.723827 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc5zc\" (UniqueName: \"kubernetes.io/projected/ce16e02d-17cf-467a-aca5-944a67d4cd79-kube-api-access-bc5zc\") pod \"ovn-controller-metrics-4dn56\" (UID: \"ce16e02d-17cf-467a-aca5-944a67d4cd79\") " pod="openstack/ovn-controller-metrics-4dn56" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.737662 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85dd5d7485-f9tjt"] Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.749250 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d5c4f869-vd6xg"] Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.754397 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.760501 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d5c4f869-vd6xg"] Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.761559 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.762133 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.777717 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-4dn56" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.806696 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.903581 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-config\") pod \"dnsmasq-dns-d5c4f869-vd6xg\" (UID: \"946b8921-7749-4631-a3a0-17b48397fb1a\") " pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.904742 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kltjz\" (UniqueName: \"kubernetes.io/projected/946b8921-7749-4631-a3a0-17b48397fb1a-kube-api-access-kltjz\") pod \"dnsmasq-dns-d5c4f869-vd6xg\" (UID: \"946b8921-7749-4631-a3a0-17b48397fb1a\") " pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.904994 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-ovsdbserver-sb\") pod \"dnsmasq-dns-d5c4f869-vd6xg\" (UID: \"946b8921-7749-4631-a3a0-17b48397fb1a\") " pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.905104 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-ovsdbserver-nb\") pod \"dnsmasq-dns-d5c4f869-vd6xg\" (UID: \"946b8921-7749-4631-a3a0-17b48397fb1a\") " pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.905177 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-dns-svc\") pod \"dnsmasq-dns-d5c4f869-vd6xg\" (UID: \"946b8921-7749-4631-a3a0-17b48397fb1a\") " pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.973487 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" event={"ID":"d9eb733c-6880-4f6e-a004-cc4028582ca9","Type":"ContainerStarted","Data":"e5ae2acf8cfa378ca3fda23273bf0ea5fbb77598a288e24fa25f1292c5d5536e"} Oct 12 07:48:16 crc kubenswrapper[4599]: I1012 07:48:16.994492 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" podStartSLOduration=2.994476415 podStartE2EDuration="2.994476415s" podCreationTimestamp="2025-10-12 07:48:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:48:16.985575139 +0000 UTC m=+793.774770641" watchObservedRunningTime="2025-10-12 07:48:16.994476415 +0000 UTC m=+793.783671916" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.006992 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-config\") pod \"dnsmasq-dns-d5c4f869-vd6xg\" (UID: \"946b8921-7749-4631-a3a0-17b48397fb1a\") " pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.007037 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kltjz\" (UniqueName: \"kubernetes.io/projected/946b8921-7749-4631-a3a0-17b48397fb1a-kube-api-access-kltjz\") pod \"dnsmasq-dns-d5c4f869-vd6xg\" (UID: \"946b8921-7749-4631-a3a0-17b48397fb1a\") " pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.007076 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-ovsdbserver-sb\") pod \"dnsmasq-dns-d5c4f869-vd6xg\" (UID: \"946b8921-7749-4631-a3a0-17b48397fb1a\") " pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.007112 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-ovsdbserver-nb\") pod \"dnsmasq-dns-d5c4f869-vd6xg\" (UID: \"946b8921-7749-4631-a3a0-17b48397fb1a\") " pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.007137 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-dns-svc\") pod \"dnsmasq-dns-d5c4f869-vd6xg\" (UID: \"946b8921-7749-4631-a3a0-17b48397fb1a\") " pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.007936 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lbmsk" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.008526 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-ovsdbserver-sb\") pod \"dnsmasq-dns-d5c4f869-vd6xg\" (UID: \"946b8921-7749-4631-a3a0-17b48397fb1a\") " pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.008703 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-dns-svc\") pod \"dnsmasq-dns-d5c4f869-vd6xg\" (UID: \"946b8921-7749-4631-a3a0-17b48397fb1a\") " pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:17 crc 
kubenswrapper[4599]: I1012 07:48:17.009036 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-config\") pod \"dnsmasq-dns-d5c4f869-vd6xg\" (UID: \"946b8921-7749-4631-a3a0-17b48397fb1a\") " pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.009064 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-ovsdbserver-nb\") pod \"dnsmasq-dns-d5c4f869-vd6xg\" (UID: \"946b8921-7749-4631-a3a0-17b48397fb1a\") " pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.013822 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.030401 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kltjz\" (UniqueName: \"kubernetes.io/projected/946b8921-7749-4631-a3a0-17b48397fb1a-kube-api-access-kltjz\") pod \"dnsmasq-dns-d5c4f869-vd6xg\" (UID: \"946b8921-7749-4631-a3a0-17b48397fb1a\") " pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.061034 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lbmsk"] Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.090546 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.130706 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85dd5d7485-f9tjt"] Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.178776 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.180095 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.184826 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-j8jsf" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.185089 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.185173 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.185214 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.190541 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.239751 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4dn56"] Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.254565 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-w8g6f"] Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.314230 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/07a217da-3192-46dd-a935-7b124b5e6961-ovn-northd-tls-certs\") pod \"ovn-northd-0\" 
(UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.314467 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/07a217da-3192-46dd-a935-7b124b5e6961-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.314493 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a217da-3192-46dd-a935-7b124b5e6961-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.314544 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07a217da-3192-46dd-a935-7b124b5e6961-scripts\") pod \"ovn-northd-0\" (UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.314565 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/07a217da-3192-46dd-a935-7b124b5e6961-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.314604 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07a217da-3192-46dd-a935-7b124b5e6961-config\") pod \"ovn-northd-0\" (UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 
07:48:17.314717 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wmhk\" (UniqueName: \"kubernetes.io/projected/07a217da-3192-46dd-a935-7b124b5e6961-kube-api-access-5wmhk\") pod \"ovn-northd-0\" (UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.416704 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07a217da-3192-46dd-a935-7b124b5e6961-scripts\") pod \"ovn-northd-0\" (UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.416762 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/07a217da-3192-46dd-a935-7b124b5e6961-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.416835 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07a217da-3192-46dd-a935-7b124b5e6961-config\") pod \"ovn-northd-0\" (UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.416860 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wmhk\" (UniqueName: \"kubernetes.io/projected/07a217da-3192-46dd-a935-7b124b5e6961-kube-api-access-5wmhk\") pod \"ovn-northd-0\" (UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.416946 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/07a217da-3192-46dd-a935-7b124b5e6961-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.416986 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/07a217da-3192-46dd-a935-7b124b5e6961-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.417021 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a217da-3192-46dd-a935-7b124b5e6961-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.418517 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07a217da-3192-46dd-a935-7b124b5e6961-scripts\") pod \"ovn-northd-0\" (UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.418636 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/07a217da-3192-46dd-a935-7b124b5e6961-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.418925 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07a217da-3192-46dd-a935-7b124b5e6961-config\") pod \"ovn-northd-0\" (UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.423138 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/07a217da-3192-46dd-a935-7b124b5e6961-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.423971 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a217da-3192-46dd-a935-7b124b5e6961-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.433945 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/07a217da-3192-46dd-a935-7b124b5e6961-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.436518 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wmhk\" (UniqueName: \"kubernetes.io/projected/07a217da-3192-46dd-a935-7b124b5e6961-kube-api-access-5wmhk\") pod \"ovn-northd-0\" (UID: \"07a217da-3192-46dd-a935-7b124b5e6961\") " pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.472227 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.498634 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.520529 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.607622 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d5c4f869-vd6xg"] Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.620290 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-etc-swift\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") " pod="openstack/swift-storage-0" Oct 12 07:48:17 crc kubenswrapper[4599]: E1012 07:48:17.620620 4599 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 12 07:48:17 crc kubenswrapper[4599]: E1012 07:48:17.620645 4599 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 12 07:48:17 crc kubenswrapper[4599]: E1012 07:48:17.620690 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-etc-swift podName:7f1c5e37-2b6d-4058-86e1-466baaa0f6c4 nodeName:}" failed. No retries permitted until 2025-10-12 07:48:19.620673798 +0000 UTC m=+796.409869299 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-etc-swift") pod "swift-storage-0" (UID: "7f1c5e37-2b6d-4058-86e1-466baaa0f6c4") : configmap "swift-ring-files" not found Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.901354 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.912608 4599 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.982173 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4dn56" event={"ID":"ce16e02d-17cf-467a-aca5-944a67d4cd79","Type":"ContainerStarted","Data":"ab9c29115ae54f061f3a970e05cc539abcead147ed24209e56c514c1f30bfc09"} Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.982244 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4dn56" event={"ID":"ce16e02d-17cf-467a-aca5-944a67d4cd79","Type":"ContainerStarted","Data":"1a7ddf5d4c1d185d1088337c87b887a847972cb7bb753f58f76449bbfcd55e21"} Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.983684 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" event={"ID":"946b8921-7749-4631-a3a0-17b48397fb1a","Type":"ContainerStarted","Data":"714a623a457bb4b68973fe59ad63bd61487350664c144f10bc20db73a00d8cb8"} Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.983718 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" event={"ID":"946b8921-7749-4631-a3a0-17b48397fb1a","Type":"ContainerStarted","Data":"2c7189a71d3a0a9deb0a7af30201fd9c1414d6ecbab74a67805ad45dbb4d2f9c"} Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.985109 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-w8g6f" 
event={"ID":"3f44cfe9-f015-4084-b100-fbb08f528667","Type":"ContainerStarted","Data":"f63579da5011ec60e2fc0ff351691517f65d506bb93c8b8d44cdce7ffb09ab39"} Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.988294 4599 generic.go:334] "Generic (PLEG): container finished" podID="67c4df83-6f28-4ad0-b53c-383ebe12642f" containerID="56a9b5568bfa5a0d13193d912f716d620169a7e201cafdbb5e6cbf9deb151b7f" exitCode=0 Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.988382 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85dd5d7485-f9tjt" event={"ID":"67c4df83-6f28-4ad0-b53c-383ebe12642f","Type":"ContainerDied","Data":"56a9b5568bfa5a0d13193d912f716d620169a7e201cafdbb5e6cbf9deb151b7f"} Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.988433 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85dd5d7485-f9tjt" event={"ID":"67c4df83-6f28-4ad0-b53c-383ebe12642f","Type":"ContainerStarted","Data":"76fd9753464f5c8143e4f1d40c61b358b5f4033bd3046761cff1d4956f017772"} Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.991682 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"07a217da-3192-46dd-a935-7b124b5e6961","Type":"ContainerStarted","Data":"d4a2c7a840af72fbc913b6b9a3f170163f14f51aee7ffe3378ff720ab90d4345"} Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.992725 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" Oct 12 07:48:17 crc kubenswrapper[4599]: I1012 07:48:17.992814 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" podUID="d9eb733c-6880-4f6e-a004-cc4028582ca9" containerName="dnsmasq-dns" containerID="cri-o://e5ae2acf8cfa378ca3fda23273bf0ea5fbb77598a288e24fa25f1292c5d5536e" gracePeriod=10 Oct 12 07:48:18 crc kubenswrapper[4599]: I1012 07:48:18.001944 4599 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ovn-controller-metrics-4dn56" podStartSLOduration=2.001933649 podStartE2EDuration="2.001933649s" podCreationTimestamp="2025-10-12 07:48:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:48:18.000230366 +0000 UTC m=+794.789425878" watchObservedRunningTime="2025-10-12 07:48:18.001933649 +0000 UTC m=+794.791129151" Oct 12 07:48:18 crc kubenswrapper[4599]: I1012 07:48:18.289630 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85dd5d7485-f9tjt" Oct 12 07:48:18 crc kubenswrapper[4599]: I1012 07:48:18.438482 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67c4df83-6f28-4ad0-b53c-383ebe12642f-ovsdbserver-sb\") pod \"67c4df83-6f28-4ad0-b53c-383ebe12642f\" (UID: \"67c4df83-6f28-4ad0-b53c-383ebe12642f\") " Oct 12 07:48:18 crc kubenswrapper[4599]: I1012 07:48:18.438777 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67c4df83-6f28-4ad0-b53c-383ebe12642f-dns-svc\") pod \"67c4df83-6f28-4ad0-b53c-383ebe12642f\" (UID: \"67c4df83-6f28-4ad0-b53c-383ebe12642f\") " Oct 12 07:48:18 crc kubenswrapper[4599]: I1012 07:48:18.438806 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxh86\" (UniqueName: \"kubernetes.io/projected/67c4df83-6f28-4ad0-b53c-383ebe12642f-kube-api-access-pxh86\") pod \"67c4df83-6f28-4ad0-b53c-383ebe12642f\" (UID: \"67c4df83-6f28-4ad0-b53c-383ebe12642f\") " Oct 12 07:48:18 crc kubenswrapper[4599]: I1012 07:48:18.438830 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67c4df83-6f28-4ad0-b53c-383ebe12642f-config\") pod \"67c4df83-6f28-4ad0-b53c-383ebe12642f\" (UID: 
\"67c4df83-6f28-4ad0-b53c-383ebe12642f\") " Oct 12 07:48:18 crc kubenswrapper[4599]: I1012 07:48:18.444286 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67c4df83-6f28-4ad0-b53c-383ebe12642f-kube-api-access-pxh86" (OuterVolumeSpecName: "kube-api-access-pxh86") pod "67c4df83-6f28-4ad0-b53c-383ebe12642f" (UID: "67c4df83-6f28-4ad0-b53c-383ebe12642f"). InnerVolumeSpecName "kube-api-access-pxh86". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:18 crc kubenswrapper[4599]: I1012 07:48:18.457576 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67c4df83-6f28-4ad0-b53c-383ebe12642f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "67c4df83-6f28-4ad0-b53c-383ebe12642f" (UID: "67c4df83-6f28-4ad0-b53c-383ebe12642f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:18 crc kubenswrapper[4599]: I1012 07:48:18.458192 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67c4df83-6f28-4ad0-b53c-383ebe12642f-config" (OuterVolumeSpecName: "config") pod "67c4df83-6f28-4ad0-b53c-383ebe12642f" (UID: "67c4df83-6f28-4ad0-b53c-383ebe12642f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:18 crc kubenswrapper[4599]: I1012 07:48:18.458760 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67c4df83-6f28-4ad0-b53c-383ebe12642f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "67c4df83-6f28-4ad0-b53c-383ebe12642f" (UID: "67c4df83-6f28-4ad0-b53c-383ebe12642f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:18 crc kubenswrapper[4599]: I1012 07:48:18.542649 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67c4df83-6f28-4ad0-b53c-383ebe12642f-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:18 crc kubenswrapper[4599]: I1012 07:48:18.542702 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67c4df83-6f28-4ad0-b53c-383ebe12642f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:18 crc kubenswrapper[4599]: I1012 07:48:18.542744 4599 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67c4df83-6f28-4ad0-b53c-383ebe12642f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:18 crc kubenswrapper[4599]: I1012 07:48:18.542757 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxh86\" (UniqueName: \"kubernetes.io/projected/67c4df83-6f28-4ad0-b53c-383ebe12642f-kube-api-access-pxh86\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.002732 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85dd5d7485-f9tjt" event={"ID":"67c4df83-6f28-4ad0-b53c-383ebe12642f","Type":"ContainerDied","Data":"76fd9753464f5c8143e4f1d40c61b358b5f4033bd3046761cff1d4956f017772"} Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.002852 4599 scope.go:117] "RemoveContainer" containerID="56a9b5568bfa5a0d13193d912f716d620169a7e201cafdbb5e6cbf9deb151b7f" Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.002764 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85dd5d7485-f9tjt" Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.006545 4599 generic.go:334] "Generic (PLEG): container finished" podID="946b8921-7749-4631-a3a0-17b48397fb1a" containerID="714a623a457bb4b68973fe59ad63bd61487350664c144f10bc20db73a00d8cb8" exitCode=0 Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.006629 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" event={"ID":"946b8921-7749-4631-a3a0-17b48397fb1a","Type":"ContainerDied","Data":"714a623a457bb4b68973fe59ad63bd61487350664c144f10bc20db73a00d8cb8"} Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.016560 4599 generic.go:334] "Generic (PLEG): container finished" podID="d9eb733c-6880-4f6e-a004-cc4028582ca9" containerID="e5ae2acf8cfa378ca3fda23273bf0ea5fbb77598a288e24fa25f1292c5d5536e" exitCode=0 Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.016613 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" event={"ID":"d9eb733c-6880-4f6e-a004-cc4028582ca9","Type":"ContainerDied","Data":"e5ae2acf8cfa378ca3fda23273bf0ea5fbb77598a288e24fa25f1292c5d5536e"} Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.016821 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lbmsk" podUID="f139cdb1-1669-49a2-8bc4-f68a6bfd67c7" containerName="registry-server" containerID="cri-o://cdabc263cabe01010fcde8d9787fb081990b3ea1bea0c7f033d69ad11403df57" gracePeriod=2 Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.171618 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.215305 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85dd5d7485-f9tjt"] Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.220543 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85dd5d7485-f9tjt"] Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.254859 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9eb733c-6880-4f6e-a004-cc4028582ca9-dns-svc\") pod \"d9eb733c-6880-4f6e-a004-cc4028582ca9\" (UID: \"d9eb733c-6880-4f6e-a004-cc4028582ca9\") " Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.254914 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9eb733c-6880-4f6e-a004-cc4028582ca9-config\") pod \"d9eb733c-6880-4f6e-a004-cc4028582ca9\" (UID: \"d9eb733c-6880-4f6e-a004-cc4028582ca9\") " Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.254997 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn8zx\" (UniqueName: \"kubernetes.io/projected/d9eb733c-6880-4f6e-a004-cc4028582ca9-kube-api-access-kn8zx\") pod \"d9eb733c-6880-4f6e-a004-cc4028582ca9\" (UID: \"d9eb733c-6880-4f6e-a004-cc4028582ca9\") " Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.266722 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9eb733c-6880-4f6e-a004-cc4028582ca9-kube-api-access-kn8zx" (OuterVolumeSpecName: "kube-api-access-kn8zx") pod "d9eb733c-6880-4f6e-a004-cc4028582ca9" (UID: "d9eb733c-6880-4f6e-a004-cc4028582ca9"). InnerVolumeSpecName "kube-api-access-kn8zx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.296726 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9eb733c-6880-4f6e-a004-cc4028582ca9-config" (OuterVolumeSpecName: "config") pod "d9eb733c-6880-4f6e-a004-cc4028582ca9" (UID: "d9eb733c-6880-4f6e-a004-cc4028582ca9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.325491 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9eb733c-6880-4f6e-a004-cc4028582ca9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d9eb733c-6880-4f6e-a004-cc4028582ca9" (UID: "d9eb733c-6880-4f6e-a004-cc4028582ca9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.370129 4599 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9eb733c-6880-4f6e-a004-cc4028582ca9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.370161 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9eb733c-6880-4f6e-a004-cc4028582ca9-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.370172 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn8zx\" (UniqueName: \"kubernetes.io/projected/d9eb733c-6880-4f6e-a004-cc4028582ca9-kube-api-access-kn8zx\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.443750 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lbmsk" Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.563787 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67c4df83-6f28-4ad0-b53c-383ebe12642f" path="/var/lib/kubelet/pods/67c4df83-6f28-4ad0-b53c-383ebe12642f/volumes" Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.572112 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f139cdb1-1669-49a2-8bc4-f68a6bfd67c7-utilities\") pod \"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7\" (UID: \"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7\") " Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.572250 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f139cdb1-1669-49a2-8bc4-f68a6bfd67c7-catalog-content\") pod \"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7\" (UID: \"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7\") " Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.572385 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7v7h\" (UniqueName: \"kubernetes.io/projected/f139cdb1-1669-49a2-8bc4-f68a6bfd67c7-kube-api-access-q7v7h\") pod \"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7\" (UID: \"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7\") " Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.572972 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f139cdb1-1669-49a2-8bc4-f68a6bfd67c7-utilities" (OuterVolumeSpecName: "utilities") pod "f139cdb1-1669-49a2-8bc4-f68a6bfd67c7" (UID: "f139cdb1-1669-49a2-8bc4-f68a6bfd67c7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.576535 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f139cdb1-1669-49a2-8bc4-f68a6bfd67c7-kube-api-access-q7v7h" (OuterVolumeSpecName: "kube-api-access-q7v7h") pod "f139cdb1-1669-49a2-8bc4-f68a6bfd67c7" (UID: "f139cdb1-1669-49a2-8bc4-f68a6bfd67c7"). InnerVolumeSpecName "kube-api-access-q7v7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.643835 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f139cdb1-1669-49a2-8bc4-f68a6bfd67c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f139cdb1-1669-49a2-8bc4-f68a6bfd67c7" (UID: "f139cdb1-1669-49a2-8bc4-f68a6bfd67c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.674433 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-etc-swift\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") " pod="openstack/swift-storage-0" Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.674632 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7v7h\" (UniqueName: \"kubernetes.io/projected/f139cdb1-1669-49a2-8bc4-f68a6bfd67c7-kube-api-access-q7v7h\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.674655 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f139cdb1-1669-49a2-8bc4-f68a6bfd67c7-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:19 crc kubenswrapper[4599]: I1012 07:48:19.674668 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/f139cdb1-1669-49a2-8bc4-f68a6bfd67c7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:19 crc kubenswrapper[4599]: E1012 07:48:19.674665 4599 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 12 07:48:19 crc kubenswrapper[4599]: E1012 07:48:19.674702 4599 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 12 07:48:19 crc kubenswrapper[4599]: E1012 07:48:19.674770 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-etc-swift podName:7f1c5e37-2b6d-4058-86e1-466baaa0f6c4 nodeName:}" failed. No retries permitted until 2025-10-12 07:48:23.674747335 +0000 UTC m=+800.463942837 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-etc-swift") pod "swift-storage-0" (UID: "7f1c5e37-2b6d-4058-86e1-466baaa0f6c4") : configmap "swift-ring-files" not found Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.027298 4599 generic.go:334] "Generic (PLEG): container finished" podID="f139cdb1-1669-49a2-8bc4-f68a6bfd67c7" containerID="cdabc263cabe01010fcde8d9787fb081990b3ea1bea0c7f033d69ad11403df57" exitCode=0 Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.027575 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbmsk" event={"ID":"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7","Type":"ContainerDied","Data":"cdabc263cabe01010fcde8d9787fb081990b3ea1bea0c7f033d69ad11403df57"} Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.027643 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbmsk" 
event={"ID":"f139cdb1-1669-49a2-8bc4-f68a6bfd67c7","Type":"ContainerDied","Data":"93ca963b42f9995879fcaf7ee68fc22db633ceeac305f07b1f88f1331891cd8b"} Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.027668 4599 scope.go:117] "RemoveContainer" containerID="cdabc263cabe01010fcde8d9787fb081990b3ea1bea0c7f033d69ad11403df57" Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.027833 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lbmsk" Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.033149 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"07a217da-3192-46dd-a935-7b124b5e6961","Type":"ContainerStarted","Data":"df2b6f006e2a86e05c0090879080fd270df180967be10ccca877176f06e59903"} Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.035987 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" event={"ID":"946b8921-7749-4631-a3a0-17b48397fb1a","Type":"ContainerStarted","Data":"c3a0b4b790ca4529e2982ef697da3f57431e2c32880884f8a554ddf944e7b362"} Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.036125 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.041173 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" event={"ID":"d9eb733c-6880-4f6e-a004-cc4028582ca9","Type":"ContainerDied","Data":"b1d7204e87c2600aa199f6b0d8229509b446d20e9908076032d46f0041b2d798"} Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.041239 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54f5b6d497-tl42t" Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.057988 4599 scope.go:117] "RemoveContainer" containerID="1c9db235d81e75c4898b424e4ebc71c758f8229e2b5fb73c74c204082b8f8b04" Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.059179 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" podStartSLOduration=4.059165296 podStartE2EDuration="4.059165296s" podCreationTimestamp="2025-10-12 07:48:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:48:20.053080184 +0000 UTC m=+796.842275686" watchObservedRunningTime="2025-10-12 07:48:20.059165296 +0000 UTC m=+796.848360799" Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.071921 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lbmsk"] Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.076114 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lbmsk"] Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.085950 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f5b6d497-tl42t"] Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.088146 4599 scope.go:117] "RemoveContainer" containerID="e0779ce472de984f1c08ec40b9e84f227723d1ccc1bd774eae855920872fc430" Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.089317 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54f5b6d497-tl42t"] Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.104579 4599 scope.go:117] "RemoveContainer" containerID="cdabc263cabe01010fcde8d9787fb081990b3ea1bea0c7f033d69ad11403df57" Oct 12 07:48:20 crc kubenswrapper[4599]: E1012 07:48:20.105041 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"cdabc263cabe01010fcde8d9787fb081990b3ea1bea0c7f033d69ad11403df57\": container with ID starting with cdabc263cabe01010fcde8d9787fb081990b3ea1bea0c7f033d69ad11403df57 not found: ID does not exist" containerID="cdabc263cabe01010fcde8d9787fb081990b3ea1bea0c7f033d69ad11403df57" Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.105085 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdabc263cabe01010fcde8d9787fb081990b3ea1bea0c7f033d69ad11403df57"} err="failed to get container status \"cdabc263cabe01010fcde8d9787fb081990b3ea1bea0c7f033d69ad11403df57\": rpc error: code = NotFound desc = could not find container \"cdabc263cabe01010fcde8d9787fb081990b3ea1bea0c7f033d69ad11403df57\": container with ID starting with cdabc263cabe01010fcde8d9787fb081990b3ea1bea0c7f033d69ad11403df57 not found: ID does not exist" Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.105109 4599 scope.go:117] "RemoveContainer" containerID="1c9db235d81e75c4898b424e4ebc71c758f8229e2b5fb73c74c204082b8f8b04" Oct 12 07:48:20 crc kubenswrapper[4599]: E1012 07:48:20.105583 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c9db235d81e75c4898b424e4ebc71c758f8229e2b5fb73c74c204082b8f8b04\": container with ID starting with 1c9db235d81e75c4898b424e4ebc71c758f8229e2b5fb73c74c204082b8f8b04 not found: ID does not exist" containerID="1c9db235d81e75c4898b424e4ebc71c758f8229e2b5fb73c74c204082b8f8b04" Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.105618 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c9db235d81e75c4898b424e4ebc71c758f8229e2b5fb73c74c204082b8f8b04"} err="failed to get container status \"1c9db235d81e75c4898b424e4ebc71c758f8229e2b5fb73c74c204082b8f8b04\": rpc error: code = NotFound desc = could not find container 
\"1c9db235d81e75c4898b424e4ebc71c758f8229e2b5fb73c74c204082b8f8b04\": container with ID starting with 1c9db235d81e75c4898b424e4ebc71c758f8229e2b5fb73c74c204082b8f8b04 not found: ID does not exist" Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.105647 4599 scope.go:117] "RemoveContainer" containerID="e0779ce472de984f1c08ec40b9e84f227723d1ccc1bd774eae855920872fc430" Oct 12 07:48:20 crc kubenswrapper[4599]: E1012 07:48:20.106064 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0779ce472de984f1c08ec40b9e84f227723d1ccc1bd774eae855920872fc430\": container with ID starting with e0779ce472de984f1c08ec40b9e84f227723d1ccc1bd774eae855920872fc430 not found: ID does not exist" containerID="e0779ce472de984f1c08ec40b9e84f227723d1ccc1bd774eae855920872fc430" Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.106094 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0779ce472de984f1c08ec40b9e84f227723d1ccc1bd774eae855920872fc430"} err="failed to get container status \"e0779ce472de984f1c08ec40b9e84f227723d1ccc1bd774eae855920872fc430\": rpc error: code = NotFound desc = could not find container \"e0779ce472de984f1c08ec40b9e84f227723d1ccc1bd774eae855920872fc430\": container with ID starting with e0779ce472de984f1c08ec40b9e84f227723d1ccc1bd774eae855920872fc430 not found: ID does not exist" Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.106113 4599 scope.go:117] "RemoveContainer" containerID="e5ae2acf8cfa378ca3fda23273bf0ea5fbb77598a288e24fa25f1292c5d5536e" Oct 12 07:48:20 crc kubenswrapper[4599]: I1012 07:48:20.121500 4599 scope.go:117] "RemoveContainer" containerID="f721268981052aee61add8db6d2f6afacb4f9864a830bd3c38c0f9fe0bea9044" Oct 12 07:48:21 crc kubenswrapper[4599]: I1012 07:48:21.055615 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"07a217da-3192-46dd-a935-7b124b5e6961","Type":"ContainerStarted","Data":"10ca9691753510c7106608557c412e08fd896295927d8e6a0f84edf5c9569ebb"} Oct 12 07:48:21 crc kubenswrapper[4599]: I1012 07:48:21.055947 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 12 07:48:21 crc kubenswrapper[4599]: I1012 07:48:21.073716 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.212054742 podStartE2EDuration="4.073693052s" podCreationTimestamp="2025-10-12 07:48:17 +0000 UTC" firstStartedPulling="2025-10-12 07:48:17.912314952 +0000 UTC m=+794.701510454" lastFinishedPulling="2025-10-12 07:48:19.773953262 +0000 UTC m=+796.563148764" observedRunningTime="2025-10-12 07:48:21.072888625 +0000 UTC m=+797.862084126" watchObservedRunningTime="2025-10-12 07:48:21.073693052 +0000 UTC m=+797.862888554" Oct 12 07:48:21 crc kubenswrapper[4599]: I1012 07:48:21.556066 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9eb733c-6880-4f6e-a004-cc4028582ca9" path="/var/lib/kubelet/pods/d9eb733c-6880-4f6e-a004-cc4028582ca9/volumes" Oct 12 07:48:21 crc kubenswrapper[4599]: I1012 07:48:21.556680 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f139cdb1-1669-49a2-8bc4-f68a6bfd67c7" path="/var/lib/kubelet/pods/f139cdb1-1669-49a2-8bc4-f68a6bfd67c7/volumes" Oct 12 07:48:22 crc kubenswrapper[4599]: I1012 07:48:22.714733 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-qbjjh"] Oct 12 07:48:22 crc kubenswrapper[4599]: E1012 07:48:22.715561 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f139cdb1-1669-49a2-8bc4-f68a6bfd67c7" containerName="extract-utilities" Oct 12 07:48:22 crc kubenswrapper[4599]: I1012 07:48:22.715575 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="f139cdb1-1669-49a2-8bc4-f68a6bfd67c7" containerName="extract-utilities" Oct 12 07:48:22 crc 
kubenswrapper[4599]: E1012 07:48:22.715588 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67c4df83-6f28-4ad0-b53c-383ebe12642f" containerName="init" Oct 12 07:48:22 crc kubenswrapper[4599]: I1012 07:48:22.715594 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="67c4df83-6f28-4ad0-b53c-383ebe12642f" containerName="init" Oct 12 07:48:22 crc kubenswrapper[4599]: E1012 07:48:22.715611 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9eb733c-6880-4f6e-a004-cc4028582ca9" containerName="init" Oct 12 07:48:22 crc kubenswrapper[4599]: I1012 07:48:22.715616 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9eb733c-6880-4f6e-a004-cc4028582ca9" containerName="init" Oct 12 07:48:22 crc kubenswrapper[4599]: E1012 07:48:22.715632 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f139cdb1-1669-49a2-8bc4-f68a6bfd67c7" containerName="extract-content" Oct 12 07:48:22 crc kubenswrapper[4599]: I1012 07:48:22.715638 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="f139cdb1-1669-49a2-8bc4-f68a6bfd67c7" containerName="extract-content" Oct 12 07:48:22 crc kubenswrapper[4599]: E1012 07:48:22.715648 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9eb733c-6880-4f6e-a004-cc4028582ca9" containerName="dnsmasq-dns" Oct 12 07:48:22 crc kubenswrapper[4599]: I1012 07:48:22.715653 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9eb733c-6880-4f6e-a004-cc4028582ca9" containerName="dnsmasq-dns" Oct 12 07:48:22 crc kubenswrapper[4599]: E1012 07:48:22.715662 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f139cdb1-1669-49a2-8bc4-f68a6bfd67c7" containerName="registry-server" Oct 12 07:48:22 crc kubenswrapper[4599]: I1012 07:48:22.715668 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="f139cdb1-1669-49a2-8bc4-f68a6bfd67c7" containerName="registry-server" Oct 12 07:48:22 crc kubenswrapper[4599]: I1012 07:48:22.715835 4599 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d9eb733c-6880-4f6e-a004-cc4028582ca9" containerName="dnsmasq-dns" Oct 12 07:48:22 crc kubenswrapper[4599]: I1012 07:48:22.715845 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="67c4df83-6f28-4ad0-b53c-383ebe12642f" containerName="init" Oct 12 07:48:22 crc kubenswrapper[4599]: I1012 07:48:22.715861 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="f139cdb1-1669-49a2-8bc4-f68a6bfd67c7" containerName="registry-server" Oct 12 07:48:22 crc kubenswrapper[4599]: I1012 07:48:22.716431 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qbjjh" Oct 12 07:48:22 crc kubenswrapper[4599]: I1012 07:48:22.727157 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-qbjjh"] Oct 12 07:48:22 crc kubenswrapper[4599]: I1012 07:48:22.837234 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx8d6\" (UniqueName: \"kubernetes.io/projected/85dd307d-9c6b-4d7d-b639-5486431ba73f-kube-api-access-qx8d6\") pod \"keystone-db-create-qbjjh\" (UID: \"85dd307d-9c6b-4d7d-b639-5486431ba73f\") " pod="openstack/keystone-db-create-qbjjh" Oct 12 07:48:22 crc kubenswrapper[4599]: I1012 07:48:22.939192 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx8d6\" (UniqueName: \"kubernetes.io/projected/85dd307d-9c6b-4d7d-b639-5486431ba73f-kube-api-access-qx8d6\") pod \"keystone-db-create-qbjjh\" (UID: \"85dd307d-9c6b-4d7d-b639-5486431ba73f\") " pod="openstack/keystone-db-create-qbjjh" Oct 12 07:48:22 crc kubenswrapper[4599]: I1012 07:48:22.941651 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-cdhl9"] Oct 12 07:48:22 crc kubenswrapper[4599]: I1012 07:48:22.942772 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-cdhl9" Oct 12 07:48:22 crc kubenswrapper[4599]: I1012 07:48:22.955956 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-cdhl9"] Oct 12 07:48:22 crc kubenswrapper[4599]: I1012 07:48:22.956230 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx8d6\" (UniqueName: \"kubernetes.io/projected/85dd307d-9c6b-4d7d-b639-5486431ba73f-kube-api-access-qx8d6\") pod \"keystone-db-create-qbjjh\" (UID: \"85dd307d-9c6b-4d7d-b639-5486431ba73f\") " pod="openstack/keystone-db-create-qbjjh" Oct 12 07:48:23 crc kubenswrapper[4599]: I1012 07:48:23.036576 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qbjjh" Oct 12 07:48:23 crc kubenswrapper[4599]: I1012 07:48:23.041181 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvgt5\" (UniqueName: \"kubernetes.io/projected/3a8d4eed-9d78-407f-a3fb-db0af1beac68-kube-api-access-zvgt5\") pod \"placement-db-create-cdhl9\" (UID: \"3a8d4eed-9d78-407f-a3fb-db0af1beac68\") " pod="openstack/placement-db-create-cdhl9" Oct 12 07:48:23 crc kubenswrapper[4599]: I1012 07:48:23.087248 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-w8g6f" event={"ID":"3f44cfe9-f015-4084-b100-fbb08f528667","Type":"ContainerStarted","Data":"5e8806a25e947a4db8fde27c486086a1e9bf4a5d19f4bc04ea2fadf8f0b4cc93"} Oct 12 07:48:23 crc kubenswrapper[4599]: I1012 07:48:23.101972 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-w8g6f" podStartSLOduration=2.228054347 podStartE2EDuration="7.10195606s" podCreationTimestamp="2025-10-12 07:48:16 +0000 UTC" firstStartedPulling="2025-10-12 07:48:17.262224966 +0000 UTC m=+794.051420468" lastFinishedPulling="2025-10-12 07:48:22.13612668 +0000 UTC m=+798.925322181" 
observedRunningTime="2025-10-12 07:48:23.097529777 +0000 UTC m=+799.886725279" watchObservedRunningTime="2025-10-12 07:48:23.10195606 +0000 UTC m=+799.891151562" Oct 12 07:48:23 crc kubenswrapper[4599]: I1012 07:48:23.145677 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvgt5\" (UniqueName: \"kubernetes.io/projected/3a8d4eed-9d78-407f-a3fb-db0af1beac68-kube-api-access-zvgt5\") pod \"placement-db-create-cdhl9\" (UID: \"3a8d4eed-9d78-407f-a3fb-db0af1beac68\") " pod="openstack/placement-db-create-cdhl9" Oct 12 07:48:23 crc kubenswrapper[4599]: I1012 07:48:23.146024 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-gbrxb"] Oct 12 07:48:23 crc kubenswrapper[4599]: I1012 07:48:23.152640 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gbrxb" Oct 12 07:48:23 crc kubenswrapper[4599]: I1012 07:48:23.153244 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-gbrxb"] Oct 12 07:48:23 crc kubenswrapper[4599]: I1012 07:48:23.162462 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvgt5\" (UniqueName: \"kubernetes.io/projected/3a8d4eed-9d78-407f-a3fb-db0af1beac68-kube-api-access-zvgt5\") pod \"placement-db-create-cdhl9\" (UID: \"3a8d4eed-9d78-407f-a3fb-db0af1beac68\") " pod="openstack/placement-db-create-cdhl9" Oct 12 07:48:23 crc kubenswrapper[4599]: I1012 07:48:23.247982 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knpqj\" (UniqueName: \"kubernetes.io/projected/43961ba0-a4ba-4c09-875a-e461a27bac2a-kube-api-access-knpqj\") pod \"glance-db-create-gbrxb\" (UID: \"43961ba0-a4ba-4c09-875a-e461a27bac2a\") " pod="openstack/glance-db-create-gbrxb" Oct 12 07:48:23 crc kubenswrapper[4599]: I1012 07:48:23.281408 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-cdhl9" Oct 12 07:48:23 crc kubenswrapper[4599]: I1012 07:48:23.353086 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knpqj\" (UniqueName: \"kubernetes.io/projected/43961ba0-a4ba-4c09-875a-e461a27bac2a-kube-api-access-knpqj\") pod \"glance-db-create-gbrxb\" (UID: \"43961ba0-a4ba-4c09-875a-e461a27bac2a\") " pod="openstack/glance-db-create-gbrxb" Oct 12 07:48:23 crc kubenswrapper[4599]: I1012 07:48:23.375916 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knpqj\" (UniqueName: \"kubernetes.io/projected/43961ba0-a4ba-4c09-875a-e461a27bac2a-kube-api-access-knpqj\") pod \"glance-db-create-gbrxb\" (UID: \"43961ba0-a4ba-4c09-875a-e461a27bac2a\") " pod="openstack/glance-db-create-gbrxb" Oct 12 07:48:23 crc kubenswrapper[4599]: I1012 07:48:23.456544 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-qbjjh"] Oct 12 07:48:23 crc kubenswrapper[4599]: I1012 07:48:23.522417 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-gbrxb" Oct 12 07:48:23 crc kubenswrapper[4599]: I1012 07:48:23.691629 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-cdhl9"] Oct 12 07:48:23 crc kubenswrapper[4599]: W1012 07:48:23.695242 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a8d4eed_9d78_407f_a3fb_db0af1beac68.slice/crio-7f7fd1519e006b8b7628ead696511694a399a319f18efe8e603ad42db8f6d616 WatchSource:0}: Error finding container 7f7fd1519e006b8b7628ead696511694a399a319f18efe8e603ad42db8f6d616: Status 404 returned error can't find the container with id 7f7fd1519e006b8b7628ead696511694a399a319f18efe8e603ad42db8f6d616 Oct 12 07:48:23 crc kubenswrapper[4599]: I1012 07:48:23.760124 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-etc-swift\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") " pod="openstack/swift-storage-0" Oct 12 07:48:23 crc kubenswrapper[4599]: E1012 07:48:23.760389 4599 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 12 07:48:23 crc kubenswrapper[4599]: E1012 07:48:23.760419 4599 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 12 07:48:23 crc kubenswrapper[4599]: E1012 07:48:23.760496 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-etc-swift podName:7f1c5e37-2b6d-4058-86e1-466baaa0f6c4 nodeName:}" failed. No retries permitted until 2025-10-12 07:48:31.760471031 +0000 UTC m=+808.549666533 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-etc-swift") pod "swift-storage-0" (UID: "7f1c5e37-2b6d-4058-86e1-466baaa0f6c4") : configmap "swift-ring-files" not found Oct 12 07:48:23 crc kubenswrapper[4599]: I1012 07:48:23.921703 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-gbrxb"] Oct 12 07:48:23 crc kubenswrapper[4599]: W1012 07:48:23.923763 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43961ba0_a4ba_4c09_875a_e461a27bac2a.slice/crio-b1212f84406645e6e6fc66b6bfcef1e778a02448bb5d8ea09f3e93a856014876 WatchSource:0}: Error finding container b1212f84406645e6e6fc66b6bfcef1e778a02448bb5d8ea09f3e93a856014876: Status 404 returned error can't find the container with id b1212f84406645e6e6fc66b6bfcef1e778a02448bb5d8ea09f3e93a856014876 Oct 12 07:48:24 crc kubenswrapper[4599]: I1012 07:48:24.094971 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qbjjh" event={"ID":"85dd307d-9c6b-4d7d-b639-5486431ba73f","Type":"ContainerStarted","Data":"df4c4171c94e0fc268d3f4847c864b27a6e84cae1959d3cb3668daa5d0e6548c"} Oct 12 07:48:24 crc kubenswrapper[4599]: I1012 07:48:24.096895 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gbrxb" event={"ID":"43961ba0-a4ba-4c09-875a-e461a27bac2a","Type":"ContainerStarted","Data":"b1212f84406645e6e6fc66b6bfcef1e778a02448bb5d8ea09f3e93a856014876"} Oct 12 07:48:24 crc kubenswrapper[4599]: I1012 07:48:24.098101 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cdhl9" event={"ID":"3a8d4eed-9d78-407f-a3fb-db0af1beac68","Type":"ContainerStarted","Data":"7f7fd1519e006b8b7628ead696511694a399a319f18efe8e603ad42db8f6d616"} Oct 12 07:48:27 crc kubenswrapper[4599]: I1012 07:48:27.092554 4599 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:27 crc kubenswrapper[4599]: I1012 07:48:27.155796 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69bb789bb9-4nnl8"] Oct 12 07:48:27 crc kubenswrapper[4599]: I1012 07:48:27.156063 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" podUID="91fafba3-ee3c-4380-be34-b897a835c882" containerName="dnsmasq-dns" containerID="cri-o://07c16b8ba9e98a9dc2495ed15740955c70c56ee6d1d92468562577005f712803" gracePeriod=10 Oct 12 07:48:27 crc kubenswrapper[4599]: I1012 07:48:27.156436 4599 generic.go:334] "Generic (PLEG): container finished" podID="85dd307d-9c6b-4d7d-b639-5486431ba73f" containerID="9225b2079f92039fddefdaea4f9004ffc1332886b48003d919e8eaaba5c27286" exitCode=0 Oct 12 07:48:27 crc kubenswrapper[4599]: I1012 07:48:27.156556 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qbjjh" event={"ID":"85dd307d-9c6b-4d7d-b639-5486431ba73f","Type":"ContainerDied","Data":"9225b2079f92039fddefdaea4f9004ffc1332886b48003d919e8eaaba5c27286"} Oct 12 07:48:27 crc kubenswrapper[4599]: I1012 07:48:27.163680 4599 generic.go:334] "Generic (PLEG): container finished" podID="43961ba0-a4ba-4c09-875a-e461a27bac2a" containerID="79f4f29286f309ffef46334b4f8654641852a050e08565d35c5a53340cbc443c" exitCode=0 Oct 12 07:48:27 crc kubenswrapper[4599]: I1012 07:48:27.163796 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gbrxb" event={"ID":"43961ba0-a4ba-4c09-875a-e461a27bac2a","Type":"ContainerDied","Data":"79f4f29286f309ffef46334b4f8654641852a050e08565d35c5a53340cbc443c"} Oct 12 07:48:27 crc kubenswrapper[4599]: I1012 07:48:27.168732 4599 generic.go:334] "Generic (PLEG): container finished" podID="3a8d4eed-9d78-407f-a3fb-db0af1beac68" containerID="b0cbe86729cc44763ec50cdb04ac7a46a5f4f4828cee9ecd30479a94c0f0171d" exitCode=0 Oct 12 07:48:27 
crc kubenswrapper[4599]: I1012 07:48:27.168859 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cdhl9" event={"ID":"3a8d4eed-9d78-407f-a3fb-db0af1beac68","Type":"ContainerDied","Data":"b0cbe86729cc44763ec50cdb04ac7a46a5f4f4828cee9ecd30479a94c0f0171d"} Oct 12 07:48:27 crc kubenswrapper[4599]: I1012 07:48:27.552744 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" Oct 12 07:48:27 crc kubenswrapper[4599]: I1012 07:48:27.621413 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91fafba3-ee3c-4380-be34-b897a835c882-dns-svc\") pod \"91fafba3-ee3c-4380-be34-b897a835c882\" (UID: \"91fafba3-ee3c-4380-be34-b897a835c882\") " Oct 12 07:48:27 crc kubenswrapper[4599]: I1012 07:48:27.621512 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lhqd\" (UniqueName: \"kubernetes.io/projected/91fafba3-ee3c-4380-be34-b897a835c882-kube-api-access-9lhqd\") pod \"91fafba3-ee3c-4380-be34-b897a835c882\" (UID: \"91fafba3-ee3c-4380-be34-b897a835c882\") " Oct 12 07:48:27 crc kubenswrapper[4599]: I1012 07:48:27.621621 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91fafba3-ee3c-4380-be34-b897a835c882-config\") pod \"91fafba3-ee3c-4380-be34-b897a835c882\" (UID: \"91fafba3-ee3c-4380-be34-b897a835c882\") " Oct 12 07:48:27 crc kubenswrapper[4599]: I1012 07:48:27.627010 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91fafba3-ee3c-4380-be34-b897a835c882-kube-api-access-9lhqd" (OuterVolumeSpecName: "kube-api-access-9lhqd") pod "91fafba3-ee3c-4380-be34-b897a835c882" (UID: "91fafba3-ee3c-4380-be34-b897a835c882"). InnerVolumeSpecName "kube-api-access-9lhqd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:27 crc kubenswrapper[4599]: I1012 07:48:27.650736 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91fafba3-ee3c-4380-be34-b897a835c882-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "91fafba3-ee3c-4380-be34-b897a835c882" (UID: "91fafba3-ee3c-4380-be34-b897a835c882"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:27 crc kubenswrapper[4599]: I1012 07:48:27.654397 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91fafba3-ee3c-4380-be34-b897a835c882-config" (OuterVolumeSpecName: "config") pod "91fafba3-ee3c-4380-be34-b897a835c882" (UID: "91fafba3-ee3c-4380-be34-b897a835c882"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:27 crc kubenswrapper[4599]: I1012 07:48:27.723869 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91fafba3-ee3c-4380-be34-b897a835c882-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:27 crc kubenswrapper[4599]: I1012 07:48:27.723909 4599 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91fafba3-ee3c-4380-be34-b897a835c882-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:27 crc kubenswrapper[4599]: I1012 07:48:27.723918 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lhqd\" (UniqueName: \"kubernetes.io/projected/91fafba3-ee3c-4380-be34-b897a835c882-kube-api-access-9lhqd\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.176665 4599 generic.go:334] "Generic (PLEG): container finished" podID="91fafba3-ee3c-4380-be34-b897a835c882" containerID="07c16b8ba9e98a9dc2495ed15740955c70c56ee6d1d92468562577005f712803" exitCode=0 Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 
07:48:28.176722 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.176764 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" event={"ID":"91fafba3-ee3c-4380-be34-b897a835c882","Type":"ContainerDied","Data":"07c16b8ba9e98a9dc2495ed15740955c70c56ee6d1d92468562577005f712803"} Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.176806 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69bb789bb9-4nnl8" event={"ID":"91fafba3-ee3c-4380-be34-b897a835c882","Type":"ContainerDied","Data":"a85f7795ec0add658f7480a09e462840b1f1615108f33255a271de37b26d9300"} Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.176828 4599 scope.go:117] "RemoveContainer" containerID="07c16b8ba9e98a9dc2495ed15740955c70c56ee6d1d92468562577005f712803" Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.207812 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69bb789bb9-4nnl8"] Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.213023 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69bb789bb9-4nnl8"] Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.213192 4599 scope.go:117] "RemoveContainer" containerID="a946191ced61b07ba1d89c37e81acb84650d5f7b40f563b56bbdb52beb01d1a4" Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.233063 4599 scope.go:117] "RemoveContainer" containerID="07c16b8ba9e98a9dc2495ed15740955c70c56ee6d1d92468562577005f712803" Oct 12 07:48:28 crc kubenswrapper[4599]: E1012 07:48:28.234179 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07c16b8ba9e98a9dc2495ed15740955c70c56ee6d1d92468562577005f712803\": container with ID starting with 07c16b8ba9e98a9dc2495ed15740955c70c56ee6d1d92468562577005f712803 not found: 
ID does not exist" containerID="07c16b8ba9e98a9dc2495ed15740955c70c56ee6d1d92468562577005f712803" Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.234232 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07c16b8ba9e98a9dc2495ed15740955c70c56ee6d1d92468562577005f712803"} err="failed to get container status \"07c16b8ba9e98a9dc2495ed15740955c70c56ee6d1d92468562577005f712803\": rpc error: code = NotFound desc = could not find container \"07c16b8ba9e98a9dc2495ed15740955c70c56ee6d1d92468562577005f712803\": container with ID starting with 07c16b8ba9e98a9dc2495ed15740955c70c56ee6d1d92468562577005f712803 not found: ID does not exist" Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.234265 4599 scope.go:117] "RemoveContainer" containerID="a946191ced61b07ba1d89c37e81acb84650d5f7b40f563b56bbdb52beb01d1a4" Oct 12 07:48:28 crc kubenswrapper[4599]: E1012 07:48:28.234863 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a946191ced61b07ba1d89c37e81acb84650d5f7b40f563b56bbdb52beb01d1a4\": container with ID starting with a946191ced61b07ba1d89c37e81acb84650d5f7b40f563b56bbdb52beb01d1a4 not found: ID does not exist" containerID="a946191ced61b07ba1d89c37e81acb84650d5f7b40f563b56bbdb52beb01d1a4" Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.234891 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a946191ced61b07ba1d89c37e81acb84650d5f7b40f563b56bbdb52beb01d1a4"} err="failed to get container status \"a946191ced61b07ba1d89c37e81acb84650d5f7b40f563b56bbdb52beb01d1a4\": rpc error: code = NotFound desc = could not find container \"a946191ced61b07ba1d89c37e81acb84650d5f7b40f563b56bbdb52beb01d1a4\": container with ID starting with a946191ced61b07ba1d89c37e81acb84650d5f7b40f563b56bbdb52beb01d1a4 not found: ID does not exist" Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.499450 4599 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qbjjh" Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.570457 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-cdhl9" Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.576068 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gbrxb" Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.644533 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvgt5\" (UniqueName: \"kubernetes.io/projected/3a8d4eed-9d78-407f-a3fb-db0af1beac68-kube-api-access-zvgt5\") pod \"3a8d4eed-9d78-407f-a3fb-db0af1beac68\" (UID: \"3a8d4eed-9d78-407f-a3fb-db0af1beac68\") " Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.644719 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knpqj\" (UniqueName: \"kubernetes.io/projected/43961ba0-a4ba-4c09-875a-e461a27bac2a-kube-api-access-knpqj\") pod \"43961ba0-a4ba-4c09-875a-e461a27bac2a\" (UID: \"43961ba0-a4ba-4c09-875a-e461a27bac2a\") " Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.644946 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx8d6\" (UniqueName: \"kubernetes.io/projected/85dd307d-9c6b-4d7d-b639-5486431ba73f-kube-api-access-qx8d6\") pod \"85dd307d-9c6b-4d7d-b639-5486431ba73f\" (UID: \"85dd307d-9c6b-4d7d-b639-5486431ba73f\") " Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.651313 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a8d4eed-9d78-407f-a3fb-db0af1beac68-kube-api-access-zvgt5" (OuterVolumeSpecName: "kube-api-access-zvgt5") pod "3a8d4eed-9d78-407f-a3fb-db0af1beac68" (UID: "3a8d4eed-9d78-407f-a3fb-db0af1beac68"). InnerVolumeSpecName "kube-api-access-zvgt5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.651645 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43961ba0-a4ba-4c09-875a-e461a27bac2a-kube-api-access-knpqj" (OuterVolumeSpecName: "kube-api-access-knpqj") pod "43961ba0-a4ba-4c09-875a-e461a27bac2a" (UID: "43961ba0-a4ba-4c09-875a-e461a27bac2a"). InnerVolumeSpecName "kube-api-access-knpqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.651837 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85dd307d-9c6b-4d7d-b639-5486431ba73f-kube-api-access-qx8d6" (OuterVolumeSpecName: "kube-api-access-qx8d6") pod "85dd307d-9c6b-4d7d-b639-5486431ba73f" (UID: "85dd307d-9c6b-4d7d-b639-5486431ba73f"). InnerVolumeSpecName "kube-api-access-qx8d6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.748361 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx8d6\" (UniqueName: \"kubernetes.io/projected/85dd307d-9c6b-4d7d-b639-5486431ba73f-kube-api-access-qx8d6\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.748512 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvgt5\" (UniqueName: \"kubernetes.io/projected/3a8d4eed-9d78-407f-a3fb-db0af1beac68-kube-api-access-zvgt5\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:28 crc kubenswrapper[4599]: I1012 07:48:28.748579 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knpqj\" (UniqueName: \"kubernetes.io/projected/43961ba0-a4ba-4c09-875a-e461a27bac2a-kube-api-access-knpqj\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:29 crc kubenswrapper[4599]: I1012 07:48:29.186479 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-cdhl9" Oct 12 07:48:29 crc kubenswrapper[4599]: I1012 07:48:29.186509 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cdhl9" event={"ID":"3a8d4eed-9d78-407f-a3fb-db0af1beac68","Type":"ContainerDied","Data":"7f7fd1519e006b8b7628ead696511694a399a319f18efe8e603ad42db8f6d616"} Oct 12 07:48:29 crc kubenswrapper[4599]: I1012 07:48:29.186565 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f7fd1519e006b8b7628ead696511694a399a319f18efe8e603ad42db8f6d616" Oct 12 07:48:29 crc kubenswrapper[4599]: I1012 07:48:29.189932 4599 generic.go:334] "Generic (PLEG): container finished" podID="3f44cfe9-f015-4084-b100-fbb08f528667" containerID="5e8806a25e947a4db8fde27c486086a1e9bf4a5d19f4bc04ea2fadf8f0b4cc93" exitCode=0 Oct 12 07:48:29 crc kubenswrapper[4599]: I1012 07:48:29.190002 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-w8g6f" event={"ID":"3f44cfe9-f015-4084-b100-fbb08f528667","Type":"ContainerDied","Data":"5e8806a25e947a4db8fde27c486086a1e9bf4a5d19f4bc04ea2fadf8f0b4cc93"} Oct 12 07:48:29 crc kubenswrapper[4599]: I1012 07:48:29.194084 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-qbjjh" Oct 12 07:48:29 crc kubenswrapper[4599]: I1012 07:48:29.194141 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qbjjh" event={"ID":"85dd307d-9c6b-4d7d-b639-5486431ba73f","Type":"ContainerDied","Data":"df4c4171c94e0fc268d3f4847c864b27a6e84cae1959d3cb3668daa5d0e6548c"} Oct 12 07:48:29 crc kubenswrapper[4599]: I1012 07:48:29.194287 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df4c4171c94e0fc268d3f4847c864b27a6e84cae1959d3cb3668daa5d0e6548c" Oct 12 07:48:29 crc kubenswrapper[4599]: I1012 07:48:29.195787 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gbrxb" event={"ID":"43961ba0-a4ba-4c09-875a-e461a27bac2a","Type":"ContainerDied","Data":"b1212f84406645e6e6fc66b6bfcef1e778a02448bb5d8ea09f3e93a856014876"} Oct 12 07:48:29 crc kubenswrapper[4599]: I1012 07:48:29.195838 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1212f84406645e6e6fc66b6bfcef1e778a02448bb5d8ea09f3e93a856014876" Oct 12 07:48:29 crc kubenswrapper[4599]: I1012 07:48:29.195858 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gbrxb" Oct 12 07:48:29 crc kubenswrapper[4599]: I1012 07:48:29.554000 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91fafba3-ee3c-4380-be34-b897a835c882" path="/var/lib/kubelet/pods/91fafba3-ee3c-4380-be34-b897a835c882/volumes" Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.472165 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.582702 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f44cfe9-f015-4084-b100-fbb08f528667-scripts\") pod \"3f44cfe9-f015-4084-b100-fbb08f528667\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.582792 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3f44cfe9-f015-4084-b100-fbb08f528667-swiftconf\") pod \"3f44cfe9-f015-4084-b100-fbb08f528667\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.582814 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3f44cfe9-f015-4084-b100-fbb08f528667-ring-data-devices\") pod \"3f44cfe9-f015-4084-b100-fbb08f528667\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.583068 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f44cfe9-f015-4084-b100-fbb08f528667-combined-ca-bundle\") pod \"3f44cfe9-f015-4084-b100-fbb08f528667\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.583223 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3f44cfe9-f015-4084-b100-fbb08f528667-etc-swift\") pod \"3f44cfe9-f015-4084-b100-fbb08f528667\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.583328 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5llh\" (UniqueName: 
\"kubernetes.io/projected/3f44cfe9-f015-4084-b100-fbb08f528667-kube-api-access-t5llh\") pod \"3f44cfe9-f015-4084-b100-fbb08f528667\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.583469 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3f44cfe9-f015-4084-b100-fbb08f528667-dispersionconf\") pod \"3f44cfe9-f015-4084-b100-fbb08f528667\" (UID: \"3f44cfe9-f015-4084-b100-fbb08f528667\") " Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.584147 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f44cfe9-f015-4084-b100-fbb08f528667-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3f44cfe9-f015-4084-b100-fbb08f528667" (UID: "3f44cfe9-f015-4084-b100-fbb08f528667"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.584298 4599 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3f44cfe9-f015-4084-b100-fbb08f528667-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.584720 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f44cfe9-f015-4084-b100-fbb08f528667-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3f44cfe9-f015-4084-b100-fbb08f528667" (UID: "3f44cfe9-f015-4084-b100-fbb08f528667"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.589909 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f44cfe9-f015-4084-b100-fbb08f528667-kube-api-access-t5llh" (OuterVolumeSpecName: "kube-api-access-t5llh") pod "3f44cfe9-f015-4084-b100-fbb08f528667" (UID: "3f44cfe9-f015-4084-b100-fbb08f528667"). InnerVolumeSpecName "kube-api-access-t5llh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.593067 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f44cfe9-f015-4084-b100-fbb08f528667-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3f44cfe9-f015-4084-b100-fbb08f528667" (UID: "3f44cfe9-f015-4084-b100-fbb08f528667"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.602312 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f44cfe9-f015-4084-b100-fbb08f528667-scripts" (OuterVolumeSpecName: "scripts") pod "3f44cfe9-f015-4084-b100-fbb08f528667" (UID: "3f44cfe9-f015-4084-b100-fbb08f528667"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.603311 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f44cfe9-f015-4084-b100-fbb08f528667-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f44cfe9-f015-4084-b100-fbb08f528667" (UID: "3f44cfe9-f015-4084-b100-fbb08f528667"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.606034 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f44cfe9-f015-4084-b100-fbb08f528667-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3f44cfe9-f015-4084-b100-fbb08f528667" (UID: "3f44cfe9-f015-4084-b100-fbb08f528667"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.686257 4599 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3f44cfe9-f015-4084-b100-fbb08f528667-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.686421 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f44cfe9-f015-4084-b100-fbb08f528667-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.686492 4599 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3f44cfe9-f015-4084-b100-fbb08f528667-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.686555 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f44cfe9-f015-4084-b100-fbb08f528667-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.686610 4599 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3f44cfe9-f015-4084-b100-fbb08f528667-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:30 crc kubenswrapper[4599]: I1012 07:48:30.686662 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5llh\" (UniqueName: 
\"kubernetes.io/projected/3f44cfe9-f015-4084-b100-fbb08f528667-kube-api-access-t5llh\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:31 crc kubenswrapper[4599]: I1012 07:48:31.216516 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-w8g6f" event={"ID":"3f44cfe9-f015-4084-b100-fbb08f528667","Type":"ContainerDied","Data":"f63579da5011ec60e2fc0ff351691517f65d506bb93c8b8d44cdce7ffb09ab39"} Oct 12 07:48:31 crc kubenswrapper[4599]: I1012 07:48:31.216576 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f63579da5011ec60e2fc0ff351691517f65d506bb93c8b8d44cdce7ffb09ab39" Oct 12 07:48:31 crc kubenswrapper[4599]: I1012 07:48:31.216585 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-w8g6f" Oct 12 07:48:31 crc kubenswrapper[4599]: I1012 07:48:31.806113 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-etc-swift\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") " pod="openstack/swift-storage-0" Oct 12 07:48:31 crc kubenswrapper[4599]: I1012 07:48:31.810982 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f1c5e37-2b6d-4058-86e1-466baaa0f6c4-etc-swift\") pod \"swift-storage-0\" (UID: \"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4\") " pod="openstack/swift-storage-0" Oct 12 07:48:32 crc kubenswrapper[4599]: I1012 07:48:32.087922 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 12 07:48:32 crc kubenswrapper[4599]: I1012 07:48:32.547303 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 12 07:48:32 crc kubenswrapper[4599]: I1012 07:48:32.565894 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.239559 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4","Type":"ContainerStarted","Data":"d5c4181ec09a1eab8a17066c561940918c968cabef4180761c40cfba87b94c42"} Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.270046 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b848-account-create-vx6cq"] Oct 12 07:48:33 crc kubenswrapper[4599]: E1012 07:48:33.270356 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f44cfe9-f015-4084-b100-fbb08f528667" containerName="swift-ring-rebalance" Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.270371 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f44cfe9-f015-4084-b100-fbb08f528667" containerName="swift-ring-rebalance" Oct 12 07:48:33 crc kubenswrapper[4599]: E1012 07:48:33.270397 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43961ba0-a4ba-4c09-875a-e461a27bac2a" containerName="mariadb-database-create" Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.270405 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="43961ba0-a4ba-4c09-875a-e461a27bac2a" containerName="mariadb-database-create" Oct 12 07:48:33 crc kubenswrapper[4599]: E1012 07:48:33.270413 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91fafba3-ee3c-4380-be34-b897a835c882" containerName="dnsmasq-dns" Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.270419 4599 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="91fafba3-ee3c-4380-be34-b897a835c882" containerName="dnsmasq-dns" Oct 12 07:48:33 crc kubenswrapper[4599]: E1012 07:48:33.270432 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85dd307d-9c6b-4d7d-b639-5486431ba73f" containerName="mariadb-database-create" Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.270437 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="85dd307d-9c6b-4d7d-b639-5486431ba73f" containerName="mariadb-database-create" Oct 12 07:48:33 crc kubenswrapper[4599]: E1012 07:48:33.270447 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91fafba3-ee3c-4380-be34-b897a835c882" containerName="init" Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.270454 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="91fafba3-ee3c-4380-be34-b897a835c882" containerName="init" Oct 12 07:48:33 crc kubenswrapper[4599]: E1012 07:48:33.270467 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a8d4eed-9d78-407f-a3fb-db0af1beac68" containerName="mariadb-database-create" Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.270472 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a8d4eed-9d78-407f-a3fb-db0af1beac68" containerName="mariadb-database-create" Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.270605 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="85dd307d-9c6b-4d7d-b639-5486431ba73f" containerName="mariadb-database-create" Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.270613 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a8d4eed-9d78-407f-a3fb-db0af1beac68" containerName="mariadb-database-create" Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.270622 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="91fafba3-ee3c-4380-be34-b897a835c882" containerName="dnsmasq-dns" Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.270632 4599 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="3f44cfe9-f015-4084-b100-fbb08f528667" containerName="swift-ring-rebalance" Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.270638 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="43961ba0-a4ba-4c09-875a-e461a27bac2a" containerName="mariadb-database-create" Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.271126 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b848-account-create-vx6cq" Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.272839 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.276152 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b848-account-create-vx6cq"] Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.433571 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtdnf\" (UniqueName: \"kubernetes.io/projected/53b52909-5298-47ad-a169-91067086a742-kube-api-access-gtdnf\") pod \"glance-b848-account-create-vx6cq\" (UID: \"53b52909-5298-47ad-a169-91067086a742\") " pod="openstack/glance-b848-account-create-vx6cq" Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.535200 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtdnf\" (UniqueName: \"kubernetes.io/projected/53b52909-5298-47ad-a169-91067086a742-kube-api-access-gtdnf\") pod \"glance-b848-account-create-vx6cq\" (UID: \"53b52909-5298-47ad-a169-91067086a742\") " pod="openstack/glance-b848-account-create-vx6cq" Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.574781 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtdnf\" (UniqueName: \"kubernetes.io/projected/53b52909-5298-47ad-a169-91067086a742-kube-api-access-gtdnf\") pod \"glance-b848-account-create-vx6cq\" (UID: 
\"53b52909-5298-47ad-a169-91067086a742\") " pod="openstack/glance-b848-account-create-vx6cq" Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.592488 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b848-account-create-vx6cq" Oct 12 07:48:33 crc kubenswrapper[4599]: I1012 07:48:33.963483 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b848-account-create-vx6cq"] Oct 12 07:48:34 crc kubenswrapper[4599]: W1012 07:48:34.074370 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53b52909_5298_47ad_a169_91067086a742.slice/crio-e92030d5b8a6bc1ad553eddfc87e91b2a8bd2673b002b34ad5199dd8f28ce179 WatchSource:0}: Error finding container e92030d5b8a6bc1ad553eddfc87e91b2a8bd2673b002b34ad5199dd8f28ce179: Status 404 returned error can't find the container with id e92030d5b8a6bc1ad553eddfc87e91b2a8bd2673b002b34ad5199dd8f28ce179 Oct 12 07:48:34 crc kubenswrapper[4599]: I1012 07:48:34.249683 4599 generic.go:334] "Generic (PLEG): container finished" podID="53b52909-5298-47ad-a169-91067086a742" containerID="55963fb841c3f600fb80adce0627448e7a10416bcc3f410bd1453d2c01b3066d" exitCode=0 Oct 12 07:48:34 crc kubenswrapper[4599]: I1012 07:48:34.249773 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b848-account-create-vx6cq" event={"ID":"53b52909-5298-47ad-a169-91067086a742","Type":"ContainerDied","Data":"55963fb841c3f600fb80adce0627448e7a10416bcc3f410bd1453d2c01b3066d"} Oct 12 07:48:34 crc kubenswrapper[4599]: I1012 07:48:34.249812 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b848-account-create-vx6cq" event={"ID":"53b52909-5298-47ad-a169-91067086a742","Type":"ContainerStarted","Data":"e92030d5b8a6bc1ad553eddfc87e91b2a8bd2673b002b34ad5199dd8f28ce179"} Oct 12 07:48:34 crc kubenswrapper[4599]: I1012 07:48:34.251674 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4","Type":"ContainerStarted","Data":"a1772f82b8b091b1195afbf934d890ba0d55bf5f09e43769ff7d2668c386d19e"} Oct 12 07:48:35 crc kubenswrapper[4599]: I1012 07:48:35.262849 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4","Type":"ContainerStarted","Data":"20d5caddd703de46431beced1914125e51ea69ae54dd223f53913c3130bd0643"} Oct 12 07:48:35 crc kubenswrapper[4599]: I1012 07:48:35.263253 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4","Type":"ContainerStarted","Data":"57a6e6f66351af571d264d52b35f7b00611030fd9d3f92fc7b36712475271e15"} Oct 12 07:48:35 crc kubenswrapper[4599]: I1012 07:48:35.263267 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4","Type":"ContainerStarted","Data":"00c795e2a506ee4ac11c5109c356a76b3342edcd1217e106dd6d1604f66cfb3b"} Oct 12 07:48:35 crc kubenswrapper[4599]: I1012 07:48:35.521412 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b848-account-create-vx6cq" Oct 12 07:48:35 crc kubenswrapper[4599]: I1012 07:48:35.680766 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtdnf\" (UniqueName: \"kubernetes.io/projected/53b52909-5298-47ad-a169-91067086a742-kube-api-access-gtdnf\") pod \"53b52909-5298-47ad-a169-91067086a742\" (UID: \"53b52909-5298-47ad-a169-91067086a742\") " Oct 12 07:48:35 crc kubenswrapper[4599]: I1012 07:48:35.686393 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53b52909-5298-47ad-a169-91067086a742-kube-api-access-gtdnf" (OuterVolumeSpecName: "kube-api-access-gtdnf") pod "53b52909-5298-47ad-a169-91067086a742" (UID: "53b52909-5298-47ad-a169-91067086a742"). InnerVolumeSpecName "kube-api-access-gtdnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:35 crc kubenswrapper[4599]: I1012 07:48:35.783541 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtdnf\" (UniqueName: \"kubernetes.io/projected/53b52909-5298-47ad-a169-91067086a742-kube-api-access-gtdnf\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:36 crc kubenswrapper[4599]: I1012 07:48:36.276572 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b848-account-create-vx6cq" event={"ID":"53b52909-5298-47ad-a169-91067086a742","Type":"ContainerDied","Data":"e92030d5b8a6bc1ad553eddfc87e91b2a8bd2673b002b34ad5199dd8f28ce179"} Oct 12 07:48:36 crc kubenswrapper[4599]: I1012 07:48:36.276626 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e92030d5b8a6bc1ad553eddfc87e91b2a8bd2673b002b34ad5199dd8f28ce179" Oct 12 07:48:36 crc kubenswrapper[4599]: I1012 07:48:36.276627 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b848-account-create-vx6cq" Oct 12 07:48:37 crc kubenswrapper[4599]: I1012 07:48:37.286572 4599 generic.go:334] "Generic (PLEG): container finished" podID="4612b7e8-a507-4c57-989d-3411e4e302dd" containerID="5dcb020f2b51b1d9b71cd795b65e6f8bc68eb5b0896d6d3e8026f5bbb0c957cb" exitCode=0 Oct 12 07:48:37 crc kubenswrapper[4599]: I1012 07:48:37.286641 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4612b7e8-a507-4c57-989d-3411e4e302dd","Type":"ContainerDied","Data":"5dcb020f2b51b1d9b71cd795b65e6f8bc68eb5b0896d6d3e8026f5bbb0c957cb"} Oct 12 07:48:37 crc kubenswrapper[4599]: I1012 07:48:37.291932 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4","Type":"ContainerStarted","Data":"c7ce5d1e02d7b1b183e36edb2f9756f4c159017aee6b14e6a9a7decfe69af2f6"} Oct 12 07:48:37 crc kubenswrapper[4599]: I1012 07:48:37.291974 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4","Type":"ContainerStarted","Data":"47d77aab9599938c1eb8b3e4a430c96564b22f2448cb62319e089b36b85a724f"} Oct 12 07:48:37 crc kubenswrapper[4599]: I1012 07:48:37.291985 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4","Type":"ContainerStarted","Data":"3b81cdabe76254630be0769f1393582e10425bf46893272b8c773b612ca548ae"} Oct 12 07:48:37 crc kubenswrapper[4599]: I1012 07:48:37.291995 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4","Type":"ContainerStarted","Data":"ee9b4a91f30176a8cdc4a5492ad2bdb9ca0ff54c613e9dda933d61c25c0fa861"} Oct 12 07:48:37 crc kubenswrapper[4599]: I1012 07:48:37.706147 4599 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-dch7h"] Oct 12 07:48:37 crc kubenswrapper[4599]: E1012 07:48:37.706745 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53b52909-5298-47ad-a169-91067086a742" containerName="mariadb-account-create" Oct 12 07:48:37 crc kubenswrapper[4599]: I1012 07:48:37.706766 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="53b52909-5298-47ad-a169-91067086a742" containerName="mariadb-account-create" Oct 12 07:48:37 crc kubenswrapper[4599]: I1012 07:48:37.706944 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="53b52909-5298-47ad-a169-91067086a742" containerName="mariadb-account-create" Oct 12 07:48:37 crc kubenswrapper[4599]: I1012 07:48:37.708097 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dch7h" Oct 12 07:48:37 crc kubenswrapper[4599]: I1012 07:48:37.716586 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dch7h"] Oct 12 07:48:37 crc kubenswrapper[4599]: I1012 07:48:37.814763 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44c467ff-3eaf-42e6-94c7-6699da7f8be8-utilities\") pod \"community-operators-dch7h\" (UID: \"44c467ff-3eaf-42e6-94c7-6699da7f8be8\") " pod="openshift-marketplace/community-operators-dch7h" Oct 12 07:48:37 crc kubenswrapper[4599]: I1012 07:48:37.814844 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44c467ff-3eaf-42e6-94c7-6699da7f8be8-catalog-content\") pod \"community-operators-dch7h\" (UID: \"44c467ff-3eaf-42e6-94c7-6699da7f8be8\") " pod="openshift-marketplace/community-operators-dch7h" Oct 12 07:48:37 crc kubenswrapper[4599]: I1012 07:48:37.814932 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjrqg\" (UniqueName: \"kubernetes.io/projected/44c467ff-3eaf-42e6-94c7-6699da7f8be8-kube-api-access-gjrqg\") pod \"community-operators-dch7h\" (UID: \"44c467ff-3eaf-42e6-94c7-6699da7f8be8\") " pod="openshift-marketplace/community-operators-dch7h" Oct 12 07:48:37 crc kubenswrapper[4599]: I1012 07:48:37.916324 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44c467ff-3eaf-42e6-94c7-6699da7f8be8-catalog-content\") pod \"community-operators-dch7h\" (UID: \"44c467ff-3eaf-42e6-94c7-6699da7f8be8\") " pod="openshift-marketplace/community-operators-dch7h" Oct 12 07:48:37 crc kubenswrapper[4599]: I1012 07:48:37.916438 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjrqg\" (UniqueName: \"kubernetes.io/projected/44c467ff-3eaf-42e6-94c7-6699da7f8be8-kube-api-access-gjrqg\") pod \"community-operators-dch7h\" (UID: \"44c467ff-3eaf-42e6-94c7-6699da7f8be8\") " pod="openshift-marketplace/community-operators-dch7h" Oct 12 07:48:37 crc kubenswrapper[4599]: I1012 07:48:37.916498 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44c467ff-3eaf-42e6-94c7-6699da7f8be8-utilities\") pod \"community-operators-dch7h\" (UID: \"44c467ff-3eaf-42e6-94c7-6699da7f8be8\") " pod="openshift-marketplace/community-operators-dch7h" Oct 12 07:48:37 crc kubenswrapper[4599]: I1012 07:48:37.916826 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44c467ff-3eaf-42e6-94c7-6699da7f8be8-catalog-content\") pod \"community-operators-dch7h\" (UID: \"44c467ff-3eaf-42e6-94c7-6699da7f8be8\") " pod="openshift-marketplace/community-operators-dch7h" Oct 12 07:48:37 crc kubenswrapper[4599]: I1012 07:48:37.916898 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44c467ff-3eaf-42e6-94c7-6699da7f8be8-utilities\") pod \"community-operators-dch7h\" (UID: \"44c467ff-3eaf-42e6-94c7-6699da7f8be8\") " pod="openshift-marketplace/community-operators-dch7h" Oct 12 07:48:37 crc kubenswrapper[4599]: I1012 07:48:37.934465 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjrqg\" (UniqueName: \"kubernetes.io/projected/44c467ff-3eaf-42e6-94c7-6699da7f8be8-kube-api-access-gjrqg\") pod \"community-operators-dch7h\" (UID: \"44c467ff-3eaf-42e6-94c7-6699da7f8be8\") " pod="openshift-marketplace/community-operators-dch7h" Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.023713 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dch7h" Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.302817 4599 generic.go:334] "Generic (PLEG): container finished" podID="2e036a1a-bc46-419f-88e4-312037490ec1" containerID="8fe013c90596b20721f39bf5f7fa9a8546584a44b02099db74e730f6de411d84" exitCode=0 Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.302887 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2e036a1a-bc46-419f-88e4-312037490ec1","Type":"ContainerDied","Data":"8fe013c90596b20721f39bf5f7fa9a8546584a44b02099db74e730f6de411d84"} Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.311442 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4612b7e8-a507-4c57-989d-3411e4e302dd","Type":"ContainerStarted","Data":"d136139635ab8d327cc5286d3b7c813647993b35ce4f04c45a4355e388e1551c"} Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.311651 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.350309 
4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.876496941 podStartE2EDuration="1m0.350292951s" podCreationTimestamp="2025-10-12 07:47:38 +0000 UTC" firstStartedPulling="2025-10-12 07:47:40.095631175 +0000 UTC m=+756.884826677" lastFinishedPulling="2025-10-12 07:48:04.569427185 +0000 UTC m=+781.358622687" observedRunningTime="2025-10-12 07:48:38.345750481 +0000 UTC m=+815.134945983" watchObservedRunningTime="2025-10-12 07:48:38.350292951 +0000 UTC m=+815.139488454" Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.415403 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-p542w"] Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.416988 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-p542w" Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.420112 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sxsrd" Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.420144 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.422198 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-p542w"] Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.525811 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0041b3a-cda2-439c-96ae-673642206886-config-data\") pod \"glance-db-sync-p542w\" (UID: \"b0041b3a-cda2-439c-96ae-673642206886\") " pod="openstack/glance-db-sync-p542w" Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.525852 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcglj\" (UniqueName: 
\"kubernetes.io/projected/b0041b3a-cda2-439c-96ae-673642206886-kube-api-access-vcglj\") pod \"glance-db-sync-p542w\" (UID: \"b0041b3a-cda2-439c-96ae-673642206886\") " pod="openstack/glance-db-sync-p542w" Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.526106 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b0041b3a-cda2-439c-96ae-673642206886-db-sync-config-data\") pod \"glance-db-sync-p542w\" (UID: \"b0041b3a-cda2-439c-96ae-673642206886\") " pod="openstack/glance-db-sync-p542w" Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.526178 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0041b3a-cda2-439c-96ae-673642206886-combined-ca-bundle\") pod \"glance-db-sync-p542w\" (UID: \"b0041b3a-cda2-439c-96ae-673642206886\") " pod="openstack/glance-db-sync-p542w" Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.627425 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b0041b3a-cda2-439c-96ae-673642206886-db-sync-config-data\") pod \"glance-db-sync-p542w\" (UID: \"b0041b3a-cda2-439c-96ae-673642206886\") " pod="openstack/glance-db-sync-p542w" Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.627474 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0041b3a-cda2-439c-96ae-673642206886-combined-ca-bundle\") pod \"glance-db-sync-p542w\" (UID: \"b0041b3a-cda2-439c-96ae-673642206886\") " pod="openstack/glance-db-sync-p542w" Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.627523 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b0041b3a-cda2-439c-96ae-673642206886-config-data\") pod \"glance-db-sync-p542w\" (UID: \"b0041b3a-cda2-439c-96ae-673642206886\") " pod="openstack/glance-db-sync-p542w" Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.627564 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcglj\" (UniqueName: \"kubernetes.io/projected/b0041b3a-cda2-439c-96ae-673642206886-kube-api-access-vcglj\") pod \"glance-db-sync-p542w\" (UID: \"b0041b3a-cda2-439c-96ae-673642206886\") " pod="openstack/glance-db-sync-p542w" Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.631309 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b0041b3a-cda2-439c-96ae-673642206886-db-sync-config-data\") pod \"glance-db-sync-p542w\" (UID: \"b0041b3a-cda2-439c-96ae-673642206886\") " pod="openstack/glance-db-sync-p542w" Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.631577 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0041b3a-cda2-439c-96ae-673642206886-config-data\") pod \"glance-db-sync-p542w\" (UID: \"b0041b3a-cda2-439c-96ae-673642206886\") " pod="openstack/glance-db-sync-p542w" Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.632115 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0041b3a-cda2-439c-96ae-673642206886-combined-ca-bundle\") pod \"glance-db-sync-p542w\" (UID: \"b0041b3a-cda2-439c-96ae-673642206886\") " pod="openstack/glance-db-sync-p542w" Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.644681 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcglj\" (UniqueName: \"kubernetes.io/projected/b0041b3a-cda2-439c-96ae-673642206886-kube-api-access-vcglj\") pod \"glance-db-sync-p542w\" (UID: 
\"b0041b3a-cda2-439c-96ae-673642206886\") " pod="openstack/glance-db-sync-p542w" Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.725970 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dch7h"] Oct 12 07:48:38 crc kubenswrapper[4599]: I1012 07:48:38.768008 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-p542w" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.113050 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-p542w"] Oct 12 07:48:39 crc kubenswrapper[4599]: W1012 07:48:39.115689 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0041b3a_cda2_439c_96ae_673642206886.slice/crio-900aea57814e3b2ecb9df6b84fee421cc777399b533b0eac91ee939a4126cad8 WatchSource:0}: Error finding container 900aea57814e3b2ecb9df6b84fee421cc777399b533b0eac91ee939a4126cad8: Status 404 returned error can't find the container with id 900aea57814e3b2ecb9df6b84fee421cc777399b533b0eac91ee939a4126cad8 Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.340357 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4","Type":"ContainerStarted","Data":"79c7d89b0e86e6c5b5d5d0a60f62053b130dc142b113fa7c921eca0c4a741f64"} Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.340743 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4","Type":"ContainerStarted","Data":"0cedb03a5f3506b3f805297a01fb2de7201109787c92896b13b80f4be0db9952"} Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.340757 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4","Type":"ContainerStarted","Data":"7c05922e7f8ace2b332fad3b751212cd2b079cb373ff84c4f16ce0357738d503"} Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.340767 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4","Type":"ContainerStarted","Data":"d1e7d5afb428f0263f211185d7df4d8f3275ab165fe981f608be3754ab836046"} Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.340777 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4","Type":"ContainerStarted","Data":"7f3f9bf8964654188f2e51be7949b67e7565d6e2a16b3b4ad4a4e6e8bda0d609"} Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.340788 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4","Type":"ContainerStarted","Data":"e368b52ea027c65f721e0fe952a8100cf018e7b24e74d6be4e824a5dc97e82f2"} Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.340801 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f1c5e37-2b6d-4058-86e1-466baaa0f6c4","Type":"ContainerStarted","Data":"5f7dc8c35c4eec7afee1f66abd0c54703b6d1e82eff150f4334892ddcad42ad0"} Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.347860 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2e036a1a-bc46-419f-88e4-312037490ec1","Type":"ContainerStarted","Data":"ccf62acba6befb5b9bba752893021602b49c1f46c423bd607fea85c3143a2178"} Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.348173 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.349703 4599 generic.go:334] "Generic (PLEG): container finished" 
podID="44c467ff-3eaf-42e6-94c7-6699da7f8be8" containerID="3b6360e9814c6d047759eec910bda2918a1c91aac39dc75eae01fdcd9793910e" exitCode=0 Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.350278 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dch7h" event={"ID":"44c467ff-3eaf-42e6-94c7-6699da7f8be8","Type":"ContainerDied","Data":"3b6360e9814c6d047759eec910bda2918a1c91aac39dc75eae01fdcd9793910e"} Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.350327 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dch7h" event={"ID":"44c467ff-3eaf-42e6-94c7-6699da7f8be8","Type":"ContainerStarted","Data":"d65ee474371afecfb5799d5cf12cd446074d69c333e18cc5865a192753742756"} Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.353915 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-p542w" event={"ID":"b0041b3a-cda2-439c-96ae-673642206886","Type":"ContainerStarted","Data":"900aea57814e3b2ecb9df6b84fee421cc777399b533b0eac91ee939a4126cad8"} Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.398618 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=19.682962609 podStartE2EDuration="25.39859758s" podCreationTimestamp="2025-10-12 07:48:14 +0000 UTC" firstStartedPulling="2025-10-12 07:48:32.585807406 +0000 UTC m=+809.375002909" lastFinishedPulling="2025-10-12 07:48:38.301442378 +0000 UTC m=+815.090637880" observedRunningTime="2025-10-12 07:48:39.380997701 +0000 UTC m=+816.170193203" watchObservedRunningTime="2025-10-12 07:48:39.39859758 +0000 UTC m=+816.187793083" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.402146 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371975.452637 podStartE2EDuration="1m1.402139683s" podCreationTimestamp="2025-10-12 07:47:38 +0000 
UTC" firstStartedPulling="2025-10-12 07:47:40.378552667 +0000 UTC m=+757.167748170" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:48:39.396161113 +0000 UTC m=+816.185356615" watchObservedRunningTime="2025-10-12 07:48:39.402139683 +0000 UTC m=+816.191335185" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.707777 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dcf8755f-m2r8l"] Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.709184 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.711804 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.722123 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dcf8755f-m2r8l"] Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.848099 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-dns-svc\") pod \"dnsmasq-dns-5dcf8755f-m2r8l\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.848160 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-ovsdbserver-sb\") pod \"dnsmasq-dns-5dcf8755f-m2r8l\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.848260 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-config\") pod \"dnsmasq-dns-5dcf8755f-m2r8l\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.848305 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flhsf\" (UniqueName: \"kubernetes.io/projected/d0568da7-e8d7-4506-b405-8f7488ce28f9-kube-api-access-flhsf\") pod \"dnsmasq-dns-5dcf8755f-m2r8l\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.848321 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-ovsdbserver-nb\") pod \"dnsmasq-dns-5dcf8755f-m2r8l\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.848448 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-dns-swift-storage-0\") pod \"dnsmasq-dns-5dcf8755f-m2r8l\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.950249 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-dns-swift-storage-0\") pod \"dnsmasq-dns-5dcf8755f-m2r8l\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.950508 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-dns-svc\") pod \"dnsmasq-dns-5dcf8755f-m2r8l\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.950606 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-ovsdbserver-sb\") pod \"dnsmasq-dns-5dcf8755f-m2r8l\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.950720 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-config\") pod \"dnsmasq-dns-5dcf8755f-m2r8l\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.950844 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flhsf\" (UniqueName: \"kubernetes.io/projected/d0568da7-e8d7-4506-b405-8f7488ce28f9-kube-api-access-flhsf\") pod \"dnsmasq-dns-5dcf8755f-m2r8l\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.951304 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-ovsdbserver-nb\") pod \"dnsmasq-dns-5dcf8755f-m2r8l\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.951135 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-dns-swift-storage-0\") pod \"dnsmasq-dns-5dcf8755f-m2r8l\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.951399 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-dns-svc\") pod \"dnsmasq-dns-5dcf8755f-m2r8l\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.951526 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-ovsdbserver-sb\") pod \"dnsmasq-dns-5dcf8755f-m2r8l\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.951677 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-config\") pod \"dnsmasq-dns-5dcf8755f-m2r8l\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.952095 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-ovsdbserver-nb\") pod \"dnsmasq-dns-5dcf8755f-m2r8l\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:39 crc kubenswrapper[4599]: I1012 07:48:39.970782 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flhsf\" (UniqueName: \"kubernetes.io/projected/d0568da7-e8d7-4506-b405-8f7488ce28f9-kube-api-access-flhsf\") pod 
\"dnsmasq-dns-5dcf8755f-m2r8l\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:40 crc kubenswrapper[4599]: I1012 07:48:40.028595 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:40 crc kubenswrapper[4599]: I1012 07:48:40.365039 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dch7h" event={"ID":"44c467ff-3eaf-42e6-94c7-6699da7f8be8","Type":"ContainerStarted","Data":"ca1ea712abc6125505dea3af9d309f94fbce16a566858cd098da8fb6995925d8"} Oct 12 07:48:40 crc kubenswrapper[4599]: I1012 07:48:40.442492 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dcf8755f-m2r8l"] Oct 12 07:48:40 crc kubenswrapper[4599]: W1012 07:48:40.446385 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0568da7_e8d7_4506_b405_8f7488ce28f9.slice/crio-7c5721846c6c7c8df2b5bf313217b7a558c8304aab178d40e64c380422f42713 WatchSource:0}: Error finding container 7c5721846c6c7c8df2b5bf313217b7a558c8304aab178d40e64c380422f42713: Status 404 returned error can't find the container with id 7c5721846c6c7c8df2b5bf313217b7a558c8304aab178d40e64c380422f42713 Oct 12 07:48:41 crc kubenswrapper[4599]: I1012 07:48:41.376307 4599 generic.go:334] "Generic (PLEG): container finished" podID="44c467ff-3eaf-42e6-94c7-6699da7f8be8" containerID="ca1ea712abc6125505dea3af9d309f94fbce16a566858cd098da8fb6995925d8" exitCode=0 Oct 12 07:48:41 crc kubenswrapper[4599]: I1012 07:48:41.376381 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dch7h" event={"ID":"44c467ff-3eaf-42e6-94c7-6699da7f8be8","Type":"ContainerDied","Data":"ca1ea712abc6125505dea3af9d309f94fbce16a566858cd098da8fb6995925d8"} Oct 12 07:48:41 crc kubenswrapper[4599]: I1012 07:48:41.380149 4599 generic.go:334] 
"Generic (PLEG): container finished" podID="d0568da7-e8d7-4506-b405-8f7488ce28f9" containerID="c6a64ed85521f30ad8a4e0e35efdf36c47d2fc61e6549e3e9f947967949f0233" exitCode=0 Oct 12 07:48:41 crc kubenswrapper[4599]: I1012 07:48:41.380212 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" event={"ID":"d0568da7-e8d7-4506-b405-8f7488ce28f9","Type":"ContainerDied","Data":"c6a64ed85521f30ad8a4e0e35efdf36c47d2fc61e6549e3e9f947967949f0233"} Oct 12 07:48:41 crc kubenswrapper[4599]: I1012 07:48:41.380251 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" event={"ID":"d0568da7-e8d7-4506-b405-8f7488ce28f9","Type":"ContainerStarted","Data":"7c5721846c6c7c8df2b5bf313217b7a558c8304aab178d40e64c380422f42713"} Oct 12 07:48:42 crc kubenswrapper[4599]: I1012 07:48:42.396374 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dch7h" event={"ID":"44c467ff-3eaf-42e6-94c7-6699da7f8be8","Type":"ContainerStarted","Data":"878f1162b86b336dc5148caa2fb8b48e2328b2d61c559bf1baed534d12fe7746"} Oct 12 07:48:42 crc kubenswrapper[4599]: I1012 07:48:42.399560 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" event={"ID":"d0568da7-e8d7-4506-b405-8f7488ce28f9","Type":"ContainerStarted","Data":"ad652b3ca8bdcc1691f62d028ce31f4973911eb7043f562c6184fa5e4c28e41e"} Oct 12 07:48:42 crc kubenswrapper[4599]: I1012 07:48:42.400249 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:42 crc kubenswrapper[4599]: I1012 07:48:42.424271 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dch7h" podStartSLOduration=2.85949151 podStartE2EDuration="5.42424674s" podCreationTimestamp="2025-10-12 07:48:37 +0000 UTC" firstStartedPulling="2025-10-12 07:48:39.352689057 +0000 UTC m=+816.141884559" 
lastFinishedPulling="2025-10-12 07:48:41.917444287 +0000 UTC m=+818.706639789" observedRunningTime="2025-10-12 07:48:42.418356034 +0000 UTC m=+819.207551536" watchObservedRunningTime="2025-10-12 07:48:42.42424674 +0000 UTC m=+819.213442242" Oct 12 07:48:42 crc kubenswrapper[4599]: I1012 07:48:42.445667 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" podStartSLOduration=3.445645656 podStartE2EDuration="3.445645656s" podCreationTimestamp="2025-10-12 07:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:48:42.438948159 +0000 UTC m=+819.228143661" watchObservedRunningTime="2025-10-12 07:48:42.445645656 +0000 UTC m=+819.234841158" Oct 12 07:48:42 crc kubenswrapper[4599]: I1012 07:48:42.757485 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-fe76-account-create-8pkc5"] Oct 12 07:48:42 crc kubenswrapper[4599]: I1012 07:48:42.758889 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-fe76-account-create-8pkc5" Oct 12 07:48:42 crc kubenswrapper[4599]: I1012 07:48:42.760756 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 12 07:48:42 crc kubenswrapper[4599]: I1012 07:48:42.765833 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fe76-account-create-8pkc5"] Oct 12 07:48:42 crc kubenswrapper[4599]: I1012 07:48:42.805999 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqkh2\" (UniqueName: \"kubernetes.io/projected/cbd9c1bf-e0d7-4ee9-acfb-864371dd2237-kube-api-access-lqkh2\") pod \"keystone-fe76-account-create-8pkc5\" (UID: \"cbd9c1bf-e0d7-4ee9-acfb-864371dd2237\") " pod="openstack/keystone-fe76-account-create-8pkc5" Oct 12 07:48:42 crc kubenswrapper[4599]: I1012 07:48:42.908250 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqkh2\" (UniqueName: \"kubernetes.io/projected/cbd9c1bf-e0d7-4ee9-acfb-864371dd2237-kube-api-access-lqkh2\") pod \"keystone-fe76-account-create-8pkc5\" (UID: \"cbd9c1bf-e0d7-4ee9-acfb-864371dd2237\") " pod="openstack/keystone-fe76-account-create-8pkc5" Oct 12 07:48:42 crc kubenswrapper[4599]: I1012 07:48:42.928294 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqkh2\" (UniqueName: \"kubernetes.io/projected/cbd9c1bf-e0d7-4ee9-acfb-864371dd2237-kube-api-access-lqkh2\") pod \"keystone-fe76-account-create-8pkc5\" (UID: \"cbd9c1bf-e0d7-4ee9-acfb-864371dd2237\") " pod="openstack/keystone-fe76-account-create-8pkc5" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.059461 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-79f1-account-create-n2j4q"] Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.060417 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-79f1-account-create-n2j4q" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.063079 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.072678 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-79f1-account-create-n2j4q"] Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.082607 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fe76-account-create-8pkc5" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.112136 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6hmm\" (UniqueName: \"kubernetes.io/projected/bc2fcc8c-96c9-4892-968d-f738570bc088-kube-api-access-k6hmm\") pod \"placement-79f1-account-create-n2j4q\" (UID: \"bc2fcc8c-96c9-4892-968d-f738570bc088\") " pod="openstack/placement-79f1-account-create-n2j4q" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.213692 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6hmm\" (UniqueName: \"kubernetes.io/projected/bc2fcc8c-96c9-4892-968d-f738570bc088-kube-api-access-k6hmm\") pod \"placement-79f1-account-create-n2j4q\" (UID: \"bc2fcc8c-96c9-4892-968d-f738570bc088\") " pod="openstack/placement-79f1-account-create-n2j4q" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.234323 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6hmm\" (UniqueName: \"kubernetes.io/projected/bc2fcc8c-96c9-4892-968d-f738570bc088-kube-api-access-k6hmm\") pod \"placement-79f1-account-create-n2j4q\" (UID: \"bc2fcc8c-96c9-4892-968d-f738570bc088\") " pod="openstack/placement-79f1-account-create-n2j4q" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.375016 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-79f1-account-create-n2j4q" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.490072 4599 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9rbk6" podUID="ff6de4a7-bd76-46bc-a376-b1ec8c5ab712" containerName="ovn-controller" probeResult="failure" output=< Oct 12 07:48:43 crc kubenswrapper[4599]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 12 07:48:43 crc kubenswrapper[4599]: > Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.496376 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.514502 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fe76-account-create-8pkc5"] Oct 12 07:48:43 crc kubenswrapper[4599]: W1012 07:48:43.518866 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbd9c1bf_e0d7_4ee9_acfb_864371dd2237.slice/crio-d5c19ed2c87423382380383142736be04bd82b8d744f69da0285da058106ab0c WatchSource:0}: Error finding container d5c19ed2c87423382380383142736be04bd82b8d744f69da0285da058106ab0c: Status 404 returned error can't find the container with id d5c19ed2c87423382380383142736be04bd82b8d744f69da0285da058106ab0c Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.533194 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8s62q" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.627877 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-79f1-account-create-n2j4q"] Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.738594 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9rbk6-config-vmwrb"] Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.739652 4599 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.744110 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.753568 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9rbk6-config-vmwrb"] Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.827278 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/41fe5f7d-fd52-40e5-8bff-f448a431be15-additional-scripts\") pod \"ovn-controller-9rbk6-config-vmwrb\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.827373 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/41fe5f7d-fd52-40e5-8bff-f448a431be15-var-log-ovn\") pod \"ovn-controller-9rbk6-config-vmwrb\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.827447 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lffbr\" (UniqueName: \"kubernetes.io/projected/41fe5f7d-fd52-40e5-8bff-f448a431be15-kube-api-access-lffbr\") pod \"ovn-controller-9rbk6-config-vmwrb\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.827507 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/41fe5f7d-fd52-40e5-8bff-f448a431be15-var-run-ovn\") pod 
\"ovn-controller-9rbk6-config-vmwrb\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.827528 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/41fe5f7d-fd52-40e5-8bff-f448a431be15-var-run\") pod \"ovn-controller-9rbk6-config-vmwrb\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.827568 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41fe5f7d-fd52-40e5-8bff-f448a431be15-scripts\") pod \"ovn-controller-9rbk6-config-vmwrb\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.930258 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/41fe5f7d-fd52-40e5-8bff-f448a431be15-var-log-ovn\") pod \"ovn-controller-9rbk6-config-vmwrb\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.930410 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lffbr\" (UniqueName: \"kubernetes.io/projected/41fe5f7d-fd52-40e5-8bff-f448a431be15-kube-api-access-lffbr\") pod \"ovn-controller-9rbk6-config-vmwrb\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.930482 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/41fe5f7d-fd52-40e5-8bff-f448a431be15-var-run-ovn\") pod \"ovn-controller-9rbk6-config-vmwrb\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.930507 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/41fe5f7d-fd52-40e5-8bff-f448a431be15-var-run\") pod \"ovn-controller-9rbk6-config-vmwrb\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.930550 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41fe5f7d-fd52-40e5-8bff-f448a431be15-scripts\") pod \"ovn-controller-9rbk6-config-vmwrb\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.930584 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/41fe5f7d-fd52-40e5-8bff-f448a431be15-additional-scripts\") pod \"ovn-controller-9rbk6-config-vmwrb\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.930775 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/41fe5f7d-fd52-40e5-8bff-f448a431be15-var-log-ovn\") pod \"ovn-controller-9rbk6-config-vmwrb\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.930886 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/41fe5f7d-fd52-40e5-8bff-f448a431be15-var-run-ovn\") pod \"ovn-controller-9rbk6-config-vmwrb\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.931137 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/41fe5f7d-fd52-40e5-8bff-f448a431be15-var-run\") pod \"ovn-controller-9rbk6-config-vmwrb\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.931723 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/41fe5f7d-fd52-40e5-8bff-f448a431be15-additional-scripts\") pod \"ovn-controller-9rbk6-config-vmwrb\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.933144 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41fe5f7d-fd52-40e5-8bff-f448a431be15-scripts\") pod \"ovn-controller-9rbk6-config-vmwrb\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:43 crc kubenswrapper[4599]: I1012 07:48:43.957474 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lffbr\" (UniqueName: \"kubernetes.io/projected/41fe5f7d-fd52-40e5-8bff-f448a431be15-kube-api-access-lffbr\") pod \"ovn-controller-9rbk6-config-vmwrb\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:44 crc kubenswrapper[4599]: I1012 07:48:44.085701 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:44 crc kubenswrapper[4599]: I1012 07:48:44.422006 4599 generic.go:334] "Generic (PLEG): container finished" podID="bc2fcc8c-96c9-4892-968d-f738570bc088" containerID="ae1828343217bb40457a4eb0c045f4d10b0868019b799f41da9b7b4a4713abac" exitCode=0 Oct 12 07:48:44 crc kubenswrapper[4599]: I1012 07:48:44.422114 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-79f1-account-create-n2j4q" event={"ID":"bc2fcc8c-96c9-4892-968d-f738570bc088","Type":"ContainerDied","Data":"ae1828343217bb40457a4eb0c045f4d10b0868019b799f41da9b7b4a4713abac"} Oct 12 07:48:44 crc kubenswrapper[4599]: I1012 07:48:44.422161 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-79f1-account-create-n2j4q" event={"ID":"bc2fcc8c-96c9-4892-968d-f738570bc088","Type":"ContainerStarted","Data":"dde7a1aadc6ad748be04d86d9ab93c76895e2caa1c9898e739876673a8544474"} Oct 12 07:48:44 crc kubenswrapper[4599]: I1012 07:48:44.431010 4599 generic.go:334] "Generic (PLEG): container finished" podID="cbd9c1bf-e0d7-4ee9-acfb-864371dd2237" containerID="9f6c0bcc87a2d60f702d94bc107a73f5890df0096454007664f20e3334776a9d" exitCode=0 Oct 12 07:48:44 crc kubenswrapper[4599]: I1012 07:48:44.431103 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fe76-account-create-8pkc5" event={"ID":"cbd9c1bf-e0d7-4ee9-acfb-864371dd2237","Type":"ContainerDied","Data":"9f6c0bcc87a2d60f702d94bc107a73f5890df0096454007664f20e3334776a9d"} Oct 12 07:48:44 crc kubenswrapper[4599]: I1012 07:48:44.431149 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fe76-account-create-8pkc5" event={"ID":"cbd9c1bf-e0d7-4ee9-acfb-864371dd2237","Type":"ContainerStarted","Data":"d5c19ed2c87423382380383142736be04bd82b8d744f69da0285da058106ab0c"} Oct 12 07:48:44 crc kubenswrapper[4599]: W1012 07:48:44.536475 4599 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41fe5f7d_fd52_40e5_8bff_f448a431be15.slice/crio-bc06360d05412c7504c6641df2f4f7f0cf28abf61fed1a9cc2e74f749b30ea05 WatchSource:0}: Error finding container bc06360d05412c7504c6641df2f4f7f0cf28abf61fed1a9cc2e74f749b30ea05: Status 404 returned error can't find the container with id bc06360d05412c7504c6641df2f4f7f0cf28abf61fed1a9cc2e74f749b30ea05 Oct 12 07:48:44 crc kubenswrapper[4599]: I1012 07:48:44.541797 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9rbk6-config-vmwrb"] Oct 12 07:48:45 crc kubenswrapper[4599]: I1012 07:48:45.441421 4599 generic.go:334] "Generic (PLEG): container finished" podID="41fe5f7d-fd52-40e5-8bff-f448a431be15" containerID="7cf832d2071e6d4f5e705a45d863f0a6493487c0a45db06ab75702f3c66f37f1" exitCode=0 Oct 12 07:48:45 crc kubenswrapper[4599]: I1012 07:48:45.441474 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9rbk6-config-vmwrb" event={"ID":"41fe5f7d-fd52-40e5-8bff-f448a431be15","Type":"ContainerDied","Data":"7cf832d2071e6d4f5e705a45d863f0a6493487c0a45db06ab75702f3c66f37f1"} Oct 12 07:48:45 crc kubenswrapper[4599]: I1012 07:48:45.442103 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9rbk6-config-vmwrb" event={"ID":"41fe5f7d-fd52-40e5-8bff-f448a431be15","Type":"ContainerStarted","Data":"bc06360d05412c7504c6641df2f4f7f0cf28abf61fed1a9cc2e74f749b30ea05"} Oct 12 07:48:45 crc kubenswrapper[4599]: I1012 07:48:45.833128 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-79f1-account-create-n2j4q" Oct 12 07:48:45 crc kubenswrapper[4599]: I1012 07:48:45.839257 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-fe76-account-create-8pkc5" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.018490 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6hmm\" (UniqueName: \"kubernetes.io/projected/bc2fcc8c-96c9-4892-968d-f738570bc088-kube-api-access-k6hmm\") pod \"bc2fcc8c-96c9-4892-968d-f738570bc088\" (UID: \"bc2fcc8c-96c9-4892-968d-f738570bc088\") " Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.018560 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqkh2\" (UniqueName: \"kubernetes.io/projected/cbd9c1bf-e0d7-4ee9-acfb-864371dd2237-kube-api-access-lqkh2\") pod \"cbd9c1bf-e0d7-4ee9-acfb-864371dd2237\" (UID: \"cbd9c1bf-e0d7-4ee9-acfb-864371dd2237\") " Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.025433 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbd9c1bf-e0d7-4ee9-acfb-864371dd2237-kube-api-access-lqkh2" (OuterVolumeSpecName: "kube-api-access-lqkh2") pod "cbd9c1bf-e0d7-4ee9-acfb-864371dd2237" (UID: "cbd9c1bf-e0d7-4ee9-acfb-864371dd2237"). InnerVolumeSpecName "kube-api-access-lqkh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.025888 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc2fcc8c-96c9-4892-968d-f738570bc088-kube-api-access-k6hmm" (OuterVolumeSpecName: "kube-api-access-k6hmm") pod "bc2fcc8c-96c9-4892-968d-f738570bc088" (UID: "bc2fcc8c-96c9-4892-968d-f738570bc088"). InnerVolumeSpecName "kube-api-access-k6hmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.121250 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6hmm\" (UniqueName: \"kubernetes.io/projected/bc2fcc8c-96c9-4892-968d-f738570bc088-kube-api-access-k6hmm\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.121297 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqkh2\" (UniqueName: \"kubernetes.io/projected/cbd9c1bf-e0d7-4ee9-acfb-864371dd2237-kube-api-access-lqkh2\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.459368 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-79f1-account-create-n2j4q" event={"ID":"bc2fcc8c-96c9-4892-968d-f738570bc088","Type":"ContainerDied","Data":"dde7a1aadc6ad748be04d86d9ab93c76895e2caa1c9898e739876673a8544474"} Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.459411 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dde7a1aadc6ad748be04d86d9ab93c76895e2caa1c9898e739876673a8544474" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.459412 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-79f1-account-create-n2j4q" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.461166 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fe76-account-create-8pkc5" event={"ID":"cbd9c1bf-e0d7-4ee9-acfb-864371dd2237","Type":"ContainerDied","Data":"d5c19ed2c87423382380383142736be04bd82b8d744f69da0285da058106ab0c"} Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.461194 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5c19ed2c87423382380383142736be04bd82b8d744f69da0285da058106ab0c" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.461252 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fe76-account-create-8pkc5" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.735851 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.834062 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lffbr\" (UniqueName: \"kubernetes.io/projected/41fe5f7d-fd52-40e5-8bff-f448a431be15-kube-api-access-lffbr\") pod \"41fe5f7d-fd52-40e5-8bff-f448a431be15\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.834569 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/41fe5f7d-fd52-40e5-8bff-f448a431be15-additional-scripts\") pod \"41fe5f7d-fd52-40e5-8bff-f448a431be15\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.834680 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41fe5f7d-fd52-40e5-8bff-f448a431be15-scripts\") pod 
\"41fe5f7d-fd52-40e5-8bff-f448a431be15\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.834711 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/41fe5f7d-fd52-40e5-8bff-f448a431be15-var-run-ovn\") pod \"41fe5f7d-fd52-40e5-8bff-f448a431be15\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.834737 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/41fe5f7d-fd52-40e5-8bff-f448a431be15-var-run\") pod \"41fe5f7d-fd52-40e5-8bff-f448a431be15\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.834786 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/41fe5f7d-fd52-40e5-8bff-f448a431be15-var-log-ovn\") pod \"41fe5f7d-fd52-40e5-8bff-f448a431be15\" (UID: \"41fe5f7d-fd52-40e5-8bff-f448a431be15\") " Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.834838 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41fe5f7d-fd52-40e5-8bff-f448a431be15-var-run" (OuterVolumeSpecName: "var-run") pod "41fe5f7d-fd52-40e5-8bff-f448a431be15" (UID: "41fe5f7d-fd52-40e5-8bff-f448a431be15"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.834838 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41fe5f7d-fd52-40e5-8bff-f448a431be15-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "41fe5f7d-fd52-40e5-8bff-f448a431be15" (UID: "41fe5f7d-fd52-40e5-8bff-f448a431be15"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.834945 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41fe5f7d-fd52-40e5-8bff-f448a431be15-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "41fe5f7d-fd52-40e5-8bff-f448a431be15" (UID: "41fe5f7d-fd52-40e5-8bff-f448a431be15"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.835261 4599 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/41fe5f7d-fd52-40e5-8bff-f448a431be15-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.835274 4599 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/41fe5f7d-fd52-40e5-8bff-f448a431be15-var-run\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.835283 4599 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/41fe5f7d-fd52-40e5-8bff-f448a431be15-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.835770 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41fe5f7d-fd52-40e5-8bff-f448a431be15-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "41fe5f7d-fd52-40e5-8bff-f448a431be15" (UID: "41fe5f7d-fd52-40e5-8bff-f448a431be15"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.835875 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41fe5f7d-fd52-40e5-8bff-f448a431be15-scripts" (OuterVolumeSpecName: "scripts") pod "41fe5f7d-fd52-40e5-8bff-f448a431be15" (UID: "41fe5f7d-fd52-40e5-8bff-f448a431be15"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.838807 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41fe5f7d-fd52-40e5-8bff-f448a431be15-kube-api-access-lffbr" (OuterVolumeSpecName: "kube-api-access-lffbr") pod "41fe5f7d-fd52-40e5-8bff-f448a431be15" (UID: "41fe5f7d-fd52-40e5-8bff-f448a431be15"). InnerVolumeSpecName "kube-api-access-lffbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.936626 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lffbr\" (UniqueName: \"kubernetes.io/projected/41fe5f7d-fd52-40e5-8bff-f448a431be15-kube-api-access-lffbr\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.936665 4599 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/41fe5f7d-fd52-40e5-8bff-f448a431be15-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:46 crc kubenswrapper[4599]: I1012 07:48:46.936676 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41fe5f7d-fd52-40e5-8bff-f448a431be15-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:47 crc kubenswrapper[4599]: I1012 07:48:47.472313 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9rbk6-config-vmwrb" 
event={"ID":"41fe5f7d-fd52-40e5-8bff-f448a431be15","Type":"ContainerDied","Data":"bc06360d05412c7504c6641df2f4f7f0cf28abf61fed1a9cc2e74f749b30ea05"} Oct 12 07:48:47 crc kubenswrapper[4599]: I1012 07:48:47.472387 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc06360d05412c7504c6641df2f4f7f0cf28abf61fed1a9cc2e74f749b30ea05" Oct 12 07:48:47 crc kubenswrapper[4599]: I1012 07:48:47.472384 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9rbk6-config-vmwrb" Oct 12 07:48:47 crc kubenswrapper[4599]: I1012 07:48:47.829806 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9rbk6-config-vmwrb"] Oct 12 07:48:47 crc kubenswrapper[4599]: I1012 07:48:47.842605 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9rbk6-config-vmwrb"] Oct 12 07:48:47 crc kubenswrapper[4599]: I1012 07:48:47.925056 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9rbk6-config-fcvrx"] Oct 12 07:48:47 crc kubenswrapper[4599]: E1012 07:48:47.926760 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41fe5f7d-fd52-40e5-8bff-f448a431be15" containerName="ovn-config" Oct 12 07:48:47 crc kubenswrapper[4599]: I1012 07:48:47.926783 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="41fe5f7d-fd52-40e5-8bff-f448a431be15" containerName="ovn-config" Oct 12 07:48:47 crc kubenswrapper[4599]: E1012 07:48:47.926815 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd9c1bf-e0d7-4ee9-acfb-864371dd2237" containerName="mariadb-account-create" Oct 12 07:48:47 crc kubenswrapper[4599]: I1012 07:48:47.926822 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd9c1bf-e0d7-4ee9-acfb-864371dd2237" containerName="mariadb-account-create" Oct 12 07:48:47 crc kubenswrapper[4599]: E1012 07:48:47.926845 4599 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bc2fcc8c-96c9-4892-968d-f738570bc088" containerName="mariadb-account-create" Oct 12 07:48:47 crc kubenswrapper[4599]: I1012 07:48:47.926852 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2fcc8c-96c9-4892-968d-f738570bc088" containerName="mariadb-account-create" Oct 12 07:48:47 crc kubenswrapper[4599]: I1012 07:48:47.927001 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd9c1bf-e0d7-4ee9-acfb-864371dd2237" containerName="mariadb-account-create" Oct 12 07:48:47 crc kubenswrapper[4599]: I1012 07:48:47.927025 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="41fe5f7d-fd52-40e5-8bff-f448a431be15" containerName="ovn-config" Oct 12 07:48:47 crc kubenswrapper[4599]: I1012 07:48:47.927042 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc2fcc8c-96c9-4892-968d-f738570bc088" containerName="mariadb-account-create" Oct 12 07:48:47 crc kubenswrapper[4599]: I1012 07:48:47.929496 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:47 crc kubenswrapper[4599]: I1012 07:48:47.932061 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 12 07:48:47 crc kubenswrapper[4599]: I1012 07:48:47.934908 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9rbk6-config-fcvrx"] Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.024631 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dch7h" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.024685 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dch7h" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.057783 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzpwn\" (UniqueName: \"kubernetes.io/projected/12556204-17e0-478c-882d-cd17ccf87c13-kube-api-access-nzpwn\") pod \"ovn-controller-9rbk6-config-fcvrx\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.057875 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12556204-17e0-478c-882d-cd17ccf87c13-scripts\") pod \"ovn-controller-9rbk6-config-fcvrx\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.058052 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/12556204-17e0-478c-882d-cd17ccf87c13-additional-scripts\") pod \"ovn-controller-9rbk6-config-fcvrx\" (UID: 
\"12556204-17e0-478c-882d-cd17ccf87c13\") " pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.058195 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12556204-17e0-478c-882d-cd17ccf87c13-var-run\") pod \"ovn-controller-9rbk6-config-fcvrx\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.058483 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12556204-17e0-478c-882d-cd17ccf87c13-var-log-ovn\") pod \"ovn-controller-9rbk6-config-fcvrx\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.058617 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12556204-17e0-478c-882d-cd17ccf87c13-var-run-ovn\") pod \"ovn-controller-9rbk6-config-fcvrx\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.067408 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dch7h" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.162434 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzpwn\" (UniqueName: \"kubernetes.io/projected/12556204-17e0-478c-882d-cd17ccf87c13-kube-api-access-nzpwn\") pod \"ovn-controller-9rbk6-config-fcvrx\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.162642 4599 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12556204-17e0-478c-882d-cd17ccf87c13-scripts\") pod \"ovn-controller-9rbk6-config-fcvrx\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.162688 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/12556204-17e0-478c-882d-cd17ccf87c13-additional-scripts\") pod \"ovn-controller-9rbk6-config-fcvrx\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.162776 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12556204-17e0-478c-882d-cd17ccf87c13-var-run\") pod \"ovn-controller-9rbk6-config-fcvrx\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.162814 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12556204-17e0-478c-882d-cd17ccf87c13-var-log-ovn\") pod \"ovn-controller-9rbk6-config-fcvrx\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.162845 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12556204-17e0-478c-882d-cd17ccf87c13-var-run-ovn\") pod \"ovn-controller-9rbk6-config-fcvrx\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.163648 4599 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/12556204-17e0-478c-882d-cd17ccf87c13-additional-scripts\") pod \"ovn-controller-9rbk6-config-fcvrx\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.164193 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12556204-17e0-478c-882d-cd17ccf87c13-var-log-ovn\") pod \"ovn-controller-9rbk6-config-fcvrx\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.164196 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12556204-17e0-478c-882d-cd17ccf87c13-var-run-ovn\") pod \"ovn-controller-9rbk6-config-fcvrx\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.164255 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12556204-17e0-478c-882d-cd17ccf87c13-var-run\") pod \"ovn-controller-9rbk6-config-fcvrx\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.165390 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12556204-17e0-478c-882d-cd17ccf87c13-scripts\") pod \"ovn-controller-9rbk6-config-fcvrx\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.180435 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzpwn\" 
(UniqueName: \"kubernetes.io/projected/12556204-17e0-478c-882d-cd17ccf87c13-kube-api-access-nzpwn\") pod \"ovn-controller-9rbk6-config-fcvrx\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.244818 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.483484 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-9rbk6" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.534536 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dch7h" Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.579191 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dch7h"] Oct 12 07:48:48 crc kubenswrapper[4599]: I1012 07:48:48.654007 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9rbk6-config-fcvrx"] Oct 12 07:48:49 crc kubenswrapper[4599]: I1012 07:48:49.490512 4599 generic.go:334] "Generic (PLEG): container finished" podID="12556204-17e0-478c-882d-cd17ccf87c13" containerID="fe7e32371e0aaaf494f9280241171cf51b492232bc1f28959847c3d245f89a83" exitCode=0 Oct 12 07:48:49 crc kubenswrapper[4599]: I1012 07:48:49.490604 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9rbk6-config-fcvrx" event={"ID":"12556204-17e0-478c-882d-cd17ccf87c13","Type":"ContainerDied","Data":"fe7e32371e0aaaf494f9280241171cf51b492232bc1f28959847c3d245f89a83"} Oct 12 07:48:49 crc kubenswrapper[4599]: I1012 07:48:49.490930 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9rbk6-config-fcvrx" 
event={"ID":"12556204-17e0-478c-882d-cd17ccf87c13","Type":"ContainerStarted","Data":"2be266329e64a81978be28371b41401edb755b305b938f6f283040603bdc6d1d"} Oct 12 07:48:49 crc kubenswrapper[4599]: I1012 07:48:49.553450 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41fe5f7d-fd52-40e5-8bff-f448a431be15" path="/var/lib/kubelet/pods/41fe5f7d-fd52-40e5-8bff-f448a431be15/volumes" Oct 12 07:48:49 crc kubenswrapper[4599]: I1012 07:48:49.608645 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 12 07:48:49 crc kubenswrapper[4599]: I1012 07:48:49.907494 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:48:49 crc kubenswrapper[4599]: I1012 07:48:49.939539 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-qdm5f"] Oct 12 07:48:49 crc kubenswrapper[4599]: I1012 07:48:49.940610 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qdm5f" Oct 12 07:48:49 crc kubenswrapper[4599]: I1012 07:48:49.978097 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qdm5f"] Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.030571 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.047444 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-mhctb"] Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.048778 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-mhctb" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.067897 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-mhctb"] Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.104169 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d5c4f869-vd6xg"] Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.104462 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" podUID="946b8921-7749-4631-a3a0-17b48397fb1a" containerName="dnsmasq-dns" containerID="cri-o://c3a0b4b790ca4529e2982ef697da3f57431e2c32880884f8a554ddf944e7b362" gracePeriod=10 Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.116507 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz456\" (UniqueName: \"kubernetes.io/projected/9f881966-2c70-43ee-bcbd-2fca447e0697-kube-api-access-gz456\") pod \"cinder-db-create-qdm5f\" (UID: \"9f881966-2c70-43ee-bcbd-2fca447e0697\") " pod="openstack/cinder-db-create-qdm5f" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.218055 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz456\" (UniqueName: \"kubernetes.io/projected/9f881966-2c70-43ee-bcbd-2fca447e0697-kube-api-access-gz456\") pod \"cinder-db-create-qdm5f\" (UID: \"9f881966-2c70-43ee-bcbd-2fca447e0697\") " pod="openstack/cinder-db-create-qdm5f" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.218193 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lts9k\" (UniqueName: \"kubernetes.io/projected/797ba2c7-ef03-4514-b48c-d4267a650fbc-kube-api-access-lts9k\") pod \"barbican-db-create-mhctb\" (UID: \"797ba2c7-ef03-4514-b48c-d4267a650fbc\") " pod="openstack/barbican-db-create-mhctb" Oct 12 07:48:50 crc 
kubenswrapper[4599]: I1012 07:48:50.252874 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz456\" (UniqueName: \"kubernetes.io/projected/9f881966-2c70-43ee-bcbd-2fca447e0697-kube-api-access-gz456\") pod \"cinder-db-create-qdm5f\" (UID: \"9f881966-2c70-43ee-bcbd-2fca447e0697\") " pod="openstack/cinder-db-create-qdm5f" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.255996 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qdm5f" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.321487 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lts9k\" (UniqueName: \"kubernetes.io/projected/797ba2c7-ef03-4514-b48c-d4267a650fbc-kube-api-access-lts9k\") pod \"barbican-db-create-mhctb\" (UID: \"797ba2c7-ef03-4514-b48c-d4267a650fbc\") " pod="openstack/barbican-db-create-mhctb" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.332388 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-hmf2g"] Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.336102 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hmf2g" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.346953 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hmf2g"] Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.354024 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lts9k\" (UniqueName: \"kubernetes.io/projected/797ba2c7-ef03-4514-b48c-d4267a650fbc-kube-api-access-lts9k\") pod \"barbican-db-create-mhctb\" (UID: \"797ba2c7-ef03-4514-b48c-d4267a650fbc\") " pod="openstack/barbican-db-create-mhctb" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.371242 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-mhctb" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.402872 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-s54p2"] Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.404113 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-s54p2" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.405880 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.406104 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.406121 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.408942 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6vkdx" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.411849 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-s54p2"] Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.499684 4599 generic.go:334] "Generic (PLEG): container finished" podID="946b8921-7749-4631-a3a0-17b48397fb1a" containerID="c3a0b4b790ca4529e2982ef697da3f57431e2c32880884f8a554ddf944e7b362" exitCode=0 Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.499757 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" event={"ID":"946b8921-7749-4631-a3a0-17b48397fb1a","Type":"ContainerDied","Data":"c3a0b4b790ca4529e2982ef697da3f57431e2c32880884f8a554ddf944e7b362"} Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.499967 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dch7h" 
podUID="44c467ff-3eaf-42e6-94c7-6699da7f8be8" containerName="registry-server" containerID="cri-o://878f1162b86b336dc5148caa2fb8b48e2328b2d61c559bf1baed534d12fe7746" gracePeriod=2 Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.525235 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff665457-da4e-4e70-a9be-a64b343bd4d0-config-data\") pod \"keystone-db-sync-s54p2\" (UID: \"ff665457-da4e-4e70-a9be-a64b343bd4d0\") " pod="openstack/keystone-db-sync-s54p2" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.525897 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgv5b\" (UniqueName: \"kubernetes.io/projected/ff665457-da4e-4e70-a9be-a64b343bd4d0-kube-api-access-sgv5b\") pod \"keystone-db-sync-s54p2\" (UID: \"ff665457-da4e-4e70-a9be-a64b343bd4d0\") " pod="openstack/keystone-db-sync-s54p2" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.526005 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff665457-da4e-4e70-a9be-a64b343bd4d0-combined-ca-bundle\") pod \"keystone-db-sync-s54p2\" (UID: \"ff665457-da4e-4e70-a9be-a64b343bd4d0\") " pod="openstack/keystone-db-sync-s54p2" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.526313 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jct2l\" (UniqueName: \"kubernetes.io/projected/16cbe17e-851b-4275-9a2b-13ca14459b4a-kube-api-access-jct2l\") pod \"neutron-db-create-hmf2g\" (UID: \"16cbe17e-851b-4275-9a2b-13ca14459b4a\") " pod="openstack/neutron-db-create-hmf2g" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.628372 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgv5b\" (UniqueName: 
\"kubernetes.io/projected/ff665457-da4e-4e70-a9be-a64b343bd4d0-kube-api-access-sgv5b\") pod \"keystone-db-sync-s54p2\" (UID: \"ff665457-da4e-4e70-a9be-a64b343bd4d0\") " pod="openstack/keystone-db-sync-s54p2" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.628477 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff665457-da4e-4e70-a9be-a64b343bd4d0-combined-ca-bundle\") pod \"keystone-db-sync-s54p2\" (UID: \"ff665457-da4e-4e70-a9be-a64b343bd4d0\") " pod="openstack/keystone-db-sync-s54p2" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.628561 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jct2l\" (UniqueName: \"kubernetes.io/projected/16cbe17e-851b-4275-9a2b-13ca14459b4a-kube-api-access-jct2l\") pod \"neutron-db-create-hmf2g\" (UID: \"16cbe17e-851b-4275-9a2b-13ca14459b4a\") " pod="openstack/neutron-db-create-hmf2g" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.628604 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff665457-da4e-4e70-a9be-a64b343bd4d0-config-data\") pod \"keystone-db-sync-s54p2\" (UID: \"ff665457-da4e-4e70-a9be-a64b343bd4d0\") " pod="openstack/keystone-db-sync-s54p2" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.642184 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff665457-da4e-4e70-a9be-a64b343bd4d0-combined-ca-bundle\") pod \"keystone-db-sync-s54p2\" (UID: \"ff665457-da4e-4e70-a9be-a64b343bd4d0\") " pod="openstack/keystone-db-sync-s54p2" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.643556 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff665457-da4e-4e70-a9be-a64b343bd4d0-config-data\") pod \"keystone-db-sync-s54p2\" 
(UID: \"ff665457-da4e-4e70-a9be-a64b343bd4d0\") " pod="openstack/keystone-db-sync-s54p2" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.646772 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgv5b\" (UniqueName: \"kubernetes.io/projected/ff665457-da4e-4e70-a9be-a64b343bd4d0-kube-api-access-sgv5b\") pod \"keystone-db-sync-s54p2\" (UID: \"ff665457-da4e-4e70-a9be-a64b343bd4d0\") " pod="openstack/keystone-db-sync-s54p2" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.650410 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jct2l\" (UniqueName: \"kubernetes.io/projected/16cbe17e-851b-4275-9a2b-13ca14459b4a-kube-api-access-jct2l\") pod \"neutron-db-create-hmf2g\" (UID: \"16cbe17e-851b-4275-9a2b-13ca14459b4a\") " pod="openstack/neutron-db-create-hmf2g" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.689630 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hmf2g" Oct 12 07:48:50 crc kubenswrapper[4599]: I1012 07:48:50.717276 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-s54p2" Oct 12 07:48:51 crc kubenswrapper[4599]: I1012 07:48:51.509781 4599 generic.go:334] "Generic (PLEG): container finished" podID="44c467ff-3eaf-42e6-94c7-6699da7f8be8" containerID="878f1162b86b336dc5148caa2fb8b48e2328b2d61c559bf1baed534d12fe7746" exitCode=0 Oct 12 07:48:51 crc kubenswrapper[4599]: I1012 07:48:51.509884 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dch7h" event={"ID":"44c467ff-3eaf-42e6-94c7-6699da7f8be8","Type":"ContainerDied","Data":"878f1162b86b336dc5148caa2fb8b48e2328b2d61c559bf1baed534d12fe7746"} Oct 12 07:48:52 crc kubenswrapper[4599]: I1012 07:48:52.091381 4599 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" podUID="946b8921-7749-4631-a3a0-17b48397fb1a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.092062 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.128689 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12556204-17e0-478c-882d-cd17ccf87c13-var-run-ovn\") pod \"12556204-17e0-478c-882d-cd17ccf87c13\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.128762 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/12556204-17e0-478c-882d-cd17ccf87c13-additional-scripts\") pod \"12556204-17e0-478c-882d-cd17ccf87c13\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.128808 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12556204-17e0-478c-882d-cd17ccf87c13-var-log-ovn\") pod \"12556204-17e0-478c-882d-cd17ccf87c13\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.128853 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzpwn\" (UniqueName: \"kubernetes.io/projected/12556204-17e0-478c-882d-cd17ccf87c13-kube-api-access-nzpwn\") pod \"12556204-17e0-478c-882d-cd17ccf87c13\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.128890 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12556204-17e0-478c-882d-cd17ccf87c13-var-run\") pod \"12556204-17e0-478c-882d-cd17ccf87c13\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.128927 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/12556204-17e0-478c-882d-cd17ccf87c13-scripts\") pod \"12556204-17e0-478c-882d-cd17ccf87c13\" (UID: \"12556204-17e0-478c-882d-cd17ccf87c13\") " Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.130158 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12556204-17e0-478c-882d-cd17ccf87c13-scripts" (OuterVolumeSpecName: "scripts") pod "12556204-17e0-478c-882d-cd17ccf87c13" (UID: "12556204-17e0-478c-882d-cd17ccf87c13"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.130191 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12556204-17e0-478c-882d-cd17ccf87c13-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "12556204-17e0-478c-882d-cd17ccf87c13" (UID: "12556204-17e0-478c-882d-cd17ccf87c13"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.130672 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12556204-17e0-478c-882d-cd17ccf87c13-var-run" (OuterVolumeSpecName: "var-run") pod "12556204-17e0-478c-882d-cd17ccf87c13" (UID: "12556204-17e0-478c-882d-cd17ccf87c13"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.130735 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12556204-17e0-478c-882d-cd17ccf87c13-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "12556204-17e0-478c-882d-cd17ccf87c13" (UID: "12556204-17e0-478c-882d-cd17ccf87c13"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.130746 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12556204-17e0-478c-882d-cd17ccf87c13-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "12556204-17e0-478c-882d-cd17ccf87c13" (UID: "12556204-17e0-478c-882d-cd17ccf87c13"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.135480 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12556204-17e0-478c-882d-cd17ccf87c13-kube-api-access-nzpwn" (OuterVolumeSpecName: "kube-api-access-nzpwn") pod "12556204-17e0-478c-882d-cd17ccf87c13" (UID: "12556204-17e0-478c-882d-cd17ccf87c13"). InnerVolumeSpecName "kube-api-access-nzpwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.214931 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dch7h" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.230998 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjrqg\" (UniqueName: \"kubernetes.io/projected/44c467ff-3eaf-42e6-94c7-6699da7f8be8-kube-api-access-gjrqg\") pod \"44c467ff-3eaf-42e6-94c7-6699da7f8be8\" (UID: \"44c467ff-3eaf-42e6-94c7-6699da7f8be8\") " Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.231172 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44c467ff-3eaf-42e6-94c7-6699da7f8be8-utilities\") pod \"44c467ff-3eaf-42e6-94c7-6699da7f8be8\" (UID: \"44c467ff-3eaf-42e6-94c7-6699da7f8be8\") " Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.231313 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44c467ff-3eaf-42e6-94c7-6699da7f8be8-catalog-content\") pod \"44c467ff-3eaf-42e6-94c7-6699da7f8be8\" (UID: \"44c467ff-3eaf-42e6-94c7-6699da7f8be8\") " Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.234409 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12556204-17e0-478c-882d-cd17ccf87c13-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.234434 4599 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12556204-17e0-478c-882d-cd17ccf87c13-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.234444 4599 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/12556204-17e0-478c-882d-cd17ccf87c13-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 
07:48:56.234457 4599 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12556204-17e0-478c-882d-cd17ccf87c13-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.234468 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzpwn\" (UniqueName: \"kubernetes.io/projected/12556204-17e0-478c-882d-cd17ccf87c13-kube-api-access-nzpwn\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.234479 4599 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12556204-17e0-478c-882d-cd17ccf87c13-var-run\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.235326 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44c467ff-3eaf-42e6-94c7-6699da7f8be8-kube-api-access-gjrqg" (OuterVolumeSpecName: "kube-api-access-gjrqg") pod "44c467ff-3eaf-42e6-94c7-6699da7f8be8" (UID: "44c467ff-3eaf-42e6-94c7-6699da7f8be8"). InnerVolumeSpecName "kube-api-access-gjrqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.235845 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44c467ff-3eaf-42e6-94c7-6699da7f8be8-utilities" (OuterVolumeSpecName: "utilities") pod "44c467ff-3eaf-42e6-94c7-6699da7f8be8" (UID: "44c467ff-3eaf-42e6-94c7-6699da7f8be8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.271415 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44c467ff-3eaf-42e6-94c7-6699da7f8be8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44c467ff-3eaf-42e6-94c7-6699da7f8be8" (UID: "44c467ff-3eaf-42e6-94c7-6699da7f8be8"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.309661 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.335831 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-ovsdbserver-nb\") pod \"946b8921-7749-4631-a3a0-17b48397fb1a\" (UID: \"946b8921-7749-4631-a3a0-17b48397fb1a\") " Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.335911 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-config\") pod \"946b8921-7749-4631-a3a0-17b48397fb1a\" (UID: \"946b8921-7749-4631-a3a0-17b48397fb1a\") " Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.335979 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-ovsdbserver-sb\") pod \"946b8921-7749-4631-a3a0-17b48397fb1a\" (UID: \"946b8921-7749-4631-a3a0-17b48397fb1a\") " Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.336414 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-dns-svc\") pod \"946b8921-7749-4631-a3a0-17b48397fb1a\" (UID: \"946b8921-7749-4631-a3a0-17b48397fb1a\") " Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.336591 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kltjz\" (UniqueName: \"kubernetes.io/projected/946b8921-7749-4631-a3a0-17b48397fb1a-kube-api-access-kltjz\") pod \"946b8921-7749-4631-a3a0-17b48397fb1a\" (UID: 
\"946b8921-7749-4631-a3a0-17b48397fb1a\") " Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.337093 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44c467ff-3eaf-42e6-94c7-6699da7f8be8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.337110 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjrqg\" (UniqueName: \"kubernetes.io/projected/44c467ff-3eaf-42e6-94c7-6699da7f8be8-kube-api-access-gjrqg\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.337123 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44c467ff-3eaf-42e6-94c7-6699da7f8be8-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.340159 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/946b8921-7749-4631-a3a0-17b48397fb1a-kube-api-access-kltjz" (OuterVolumeSpecName: "kube-api-access-kltjz") pod "946b8921-7749-4631-a3a0-17b48397fb1a" (UID: "946b8921-7749-4631-a3a0-17b48397fb1a"). InnerVolumeSpecName "kube-api-access-kltjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.370148 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "946b8921-7749-4631-a3a0-17b48397fb1a" (UID: "946b8921-7749-4631-a3a0-17b48397fb1a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.370896 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "946b8921-7749-4631-a3a0-17b48397fb1a" (UID: "946b8921-7749-4631-a3a0-17b48397fb1a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.372182 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-config" (OuterVolumeSpecName: "config") pod "946b8921-7749-4631-a3a0-17b48397fb1a" (UID: "946b8921-7749-4631-a3a0-17b48397fb1a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.374835 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "946b8921-7749-4631-a3a0-17b48397fb1a" (UID: "946b8921-7749-4631-a3a0-17b48397fb1a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.440293 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.440362 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.440378 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.440389 4599 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/946b8921-7749-4631-a3a0-17b48397fb1a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.440405 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kltjz\" (UniqueName: \"kubernetes.io/projected/946b8921-7749-4631-a3a0-17b48397fb1a-kube-api-access-kltjz\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.454856 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-s54p2"] Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.459245 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-mhctb"] Oct 12 07:48:56 crc kubenswrapper[4599]: W1012 07:48:56.468482 4599 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod797ba2c7_ef03_4514_b48c_d4267a650fbc.slice/crio-4275c5ddcf33489a4f177d17de9fe9c0a1a97e0163bd96f3770dd49a189c4a7a WatchSource:0}: Error finding container 4275c5ddcf33489a4f177d17de9fe9c0a1a97e0163bd96f3770dd49a189c4a7a: Status 404 returned error can't find the container with id 4275c5ddcf33489a4f177d17de9fe9c0a1a97e0163bd96f3770dd49a189c4a7a Oct 12 07:48:56 crc kubenswrapper[4599]: W1012 07:48:56.469053 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff665457_da4e_4e70_a9be_a64b343bd4d0.slice/crio-ae204b199e580622a970f2efe4131cd591867f417329ba7c6fa2f810720afda2 WatchSource:0}: Error finding container ae204b199e580622a970f2efe4131cd591867f417329ba7c6fa2f810720afda2: Status 404 returned error can't find the container with id ae204b199e580622a970f2efe4131cd591867f417329ba7c6fa2f810720afda2 Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.548685 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-s54p2" event={"ID":"ff665457-da4e-4e70-a9be-a64b343bd4d0","Type":"ContainerStarted","Data":"ae204b199e580622a970f2efe4131cd591867f417329ba7c6fa2f810720afda2"} Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.551945 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9rbk6-config-fcvrx" event={"ID":"12556204-17e0-478c-882d-cd17ccf87c13","Type":"ContainerDied","Data":"2be266329e64a81978be28371b41401edb755b305b938f6f283040603bdc6d1d"} Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.552021 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2be266329e64a81978be28371b41401edb755b305b938f6f283040603bdc6d1d" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.552027 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9rbk6-config-fcvrx" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.555474 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" event={"ID":"946b8921-7749-4631-a3a0-17b48397fb1a","Type":"ContainerDied","Data":"2c7189a71d3a0a9deb0a7af30201fd9c1414d6ecbab74a67805ad45dbb4d2f9c"} Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.555558 4599 scope.go:117] "RemoveContainer" containerID="c3a0b4b790ca4529e2982ef697da3f57431e2c32880884f8a554ddf944e7b362" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.555725 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d5c4f869-vd6xg" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.558161 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-mhctb" event={"ID":"797ba2c7-ef03-4514-b48c-d4267a650fbc","Type":"ContainerStarted","Data":"4275c5ddcf33489a4f177d17de9fe9c0a1a97e0163bd96f3770dd49a189c4a7a"} Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.568207 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dch7h" event={"ID":"44c467ff-3eaf-42e6-94c7-6699da7f8be8","Type":"ContainerDied","Data":"d65ee474371afecfb5799d5cf12cd446074d69c333e18cc5865a192753742756"} Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.568397 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dch7h" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.584382 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-p542w" event={"ID":"b0041b3a-cda2-439c-96ae-673642206886","Type":"ContainerStarted","Data":"924847be9fa5a8aa64d08a40d77dad3f80609e2a1894277131b7b8f46b054cac"} Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.585149 4599 scope.go:117] "RemoveContainer" containerID="714a623a457bb4b68973fe59ad63bd61487350664c144f10bc20db73a00d8cb8" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.604328 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qdm5f"] Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.619187 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hmf2g"] Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.626644 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d5c4f869-vd6xg"] Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.627232 4599 scope.go:117] "RemoveContainer" containerID="878f1162b86b336dc5148caa2fb8b48e2328b2d61c559bf1baed534d12fe7746" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.643659 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d5c4f869-vd6xg"] Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.648703 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-p542w" podStartSLOduration=1.789304507 podStartE2EDuration="18.648681298s" podCreationTimestamp="2025-10-12 07:48:38 +0000 UTC" firstStartedPulling="2025-10-12 07:48:39.11917525 +0000 UTC m=+815.908370752" lastFinishedPulling="2025-10-12 07:48:55.978552041 +0000 UTC m=+832.767747543" observedRunningTime="2025-10-12 07:48:56.605452251 +0000 UTC m=+833.394647753" watchObservedRunningTime="2025-10-12 07:48:56.648681298 +0000 UTC m=+833.437876800" 
Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.654499 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dch7h"] Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.659620 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dch7h"] Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.672704 4599 scope.go:117] "RemoveContainer" containerID="ca1ea712abc6125505dea3af9d309f94fbce16a566858cd098da8fb6995925d8" Oct 12 07:48:56 crc kubenswrapper[4599]: I1012 07:48:56.714565 4599 scope.go:117] "RemoveContainer" containerID="3b6360e9814c6d047759eec910bda2918a1c91aac39dc75eae01fdcd9793910e" Oct 12 07:48:57 crc kubenswrapper[4599]: I1012 07:48:57.152818 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9rbk6-config-fcvrx"] Oct 12 07:48:57 crc kubenswrapper[4599]: I1012 07:48:57.159539 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9rbk6-config-fcvrx"] Oct 12 07:48:57 crc kubenswrapper[4599]: I1012 07:48:57.555316 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12556204-17e0-478c-882d-cd17ccf87c13" path="/var/lib/kubelet/pods/12556204-17e0-478c-882d-cd17ccf87c13/volumes" Oct 12 07:48:57 crc kubenswrapper[4599]: I1012 07:48:57.555912 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44c467ff-3eaf-42e6-94c7-6699da7f8be8" path="/var/lib/kubelet/pods/44c467ff-3eaf-42e6-94c7-6699da7f8be8/volumes" Oct 12 07:48:57 crc kubenswrapper[4599]: I1012 07:48:57.556537 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="946b8921-7749-4631-a3a0-17b48397fb1a" path="/var/lib/kubelet/pods/946b8921-7749-4631-a3a0-17b48397fb1a/volumes" Oct 12 07:48:57 crc kubenswrapper[4599]: I1012 07:48:57.592212 4599 generic.go:334] "Generic (PLEG): container finished" podID="797ba2c7-ef03-4514-b48c-d4267a650fbc" 
containerID="3ab6a014bd08df342161a19137d03a135cc021fc7bfbdd1778e28481079c111c" exitCode=0 Oct 12 07:48:57 crc kubenswrapper[4599]: I1012 07:48:57.592280 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-mhctb" event={"ID":"797ba2c7-ef03-4514-b48c-d4267a650fbc","Type":"ContainerDied","Data":"3ab6a014bd08df342161a19137d03a135cc021fc7bfbdd1778e28481079c111c"} Oct 12 07:48:57 crc kubenswrapper[4599]: I1012 07:48:57.596708 4599 generic.go:334] "Generic (PLEG): container finished" podID="9f881966-2c70-43ee-bcbd-2fca447e0697" containerID="62f62a406e902ea1e0d5391caf9693d29335193539fae3f9350db1cfaecc71ac" exitCode=0 Oct 12 07:48:57 crc kubenswrapper[4599]: I1012 07:48:57.596765 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qdm5f" event={"ID":"9f881966-2c70-43ee-bcbd-2fca447e0697","Type":"ContainerDied","Data":"62f62a406e902ea1e0d5391caf9693d29335193539fae3f9350db1cfaecc71ac"} Oct 12 07:48:57 crc kubenswrapper[4599]: I1012 07:48:57.596959 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qdm5f" event={"ID":"9f881966-2c70-43ee-bcbd-2fca447e0697","Type":"ContainerStarted","Data":"1cc313fa506a3d3ca8b8d694f85d2a527cc2c9f87808aa621373eeed56b4b9b5"} Oct 12 07:48:57 crc kubenswrapper[4599]: I1012 07:48:57.608188 4599 generic.go:334] "Generic (PLEG): container finished" podID="16cbe17e-851b-4275-9a2b-13ca14459b4a" containerID="22506781a9494fc947aaf801d02f63f16e02f92ec872e0b38ed7e3b7fb31f1c3" exitCode=0 Oct 12 07:48:57 crc kubenswrapper[4599]: I1012 07:48:57.609363 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hmf2g" event={"ID":"16cbe17e-851b-4275-9a2b-13ca14459b4a","Type":"ContainerDied","Data":"22506781a9494fc947aaf801d02f63f16e02f92ec872e0b38ed7e3b7fb31f1c3"} Oct 12 07:48:57 crc kubenswrapper[4599]: I1012 07:48:57.609392 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hmf2g" 
event={"ID":"16cbe17e-851b-4275-9a2b-13ca14459b4a","Type":"ContainerStarted","Data":"08e5083a5500068b009930866e3dbc40003d007d0786167666d712fccb640fde"} Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.003429 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qdm5f" Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.009254 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hmf2g" Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.013447 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-mhctb" Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.190225 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lts9k\" (UniqueName: \"kubernetes.io/projected/797ba2c7-ef03-4514-b48c-d4267a650fbc-kube-api-access-lts9k\") pod \"797ba2c7-ef03-4514-b48c-d4267a650fbc\" (UID: \"797ba2c7-ef03-4514-b48c-d4267a650fbc\") " Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.190716 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gz456\" (UniqueName: \"kubernetes.io/projected/9f881966-2c70-43ee-bcbd-2fca447e0697-kube-api-access-gz456\") pod \"9f881966-2c70-43ee-bcbd-2fca447e0697\" (UID: \"9f881966-2c70-43ee-bcbd-2fca447e0697\") " Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.190840 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jct2l\" (UniqueName: \"kubernetes.io/projected/16cbe17e-851b-4275-9a2b-13ca14459b4a-kube-api-access-jct2l\") pod \"16cbe17e-851b-4275-9a2b-13ca14459b4a\" (UID: \"16cbe17e-851b-4275-9a2b-13ca14459b4a\") " Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.198724 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9f881966-2c70-43ee-bcbd-2fca447e0697-kube-api-access-gz456" (OuterVolumeSpecName: "kube-api-access-gz456") pod "9f881966-2c70-43ee-bcbd-2fca447e0697" (UID: "9f881966-2c70-43ee-bcbd-2fca447e0697"). InnerVolumeSpecName "kube-api-access-gz456". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.199302 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/797ba2c7-ef03-4514-b48c-d4267a650fbc-kube-api-access-lts9k" (OuterVolumeSpecName: "kube-api-access-lts9k") pod "797ba2c7-ef03-4514-b48c-d4267a650fbc" (UID: "797ba2c7-ef03-4514-b48c-d4267a650fbc"). InnerVolumeSpecName "kube-api-access-lts9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.200065 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16cbe17e-851b-4275-9a2b-13ca14459b4a-kube-api-access-jct2l" (OuterVolumeSpecName: "kube-api-access-jct2l") pod "16cbe17e-851b-4275-9a2b-13ca14459b4a" (UID: "16cbe17e-851b-4275-9a2b-13ca14459b4a"). InnerVolumeSpecName "kube-api-access-jct2l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.292854 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lts9k\" (UniqueName: \"kubernetes.io/projected/797ba2c7-ef03-4514-b48c-d4267a650fbc-kube-api-access-lts9k\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.292926 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gz456\" (UniqueName: \"kubernetes.io/projected/9f881966-2c70-43ee-bcbd-2fca447e0697-kube-api-access-gz456\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.292938 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jct2l\" (UniqueName: \"kubernetes.io/projected/16cbe17e-851b-4275-9a2b-13ca14459b4a-kube-api-access-jct2l\") on node \"crc\" DevicePath \"\"" Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.625199 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-mhctb" event={"ID":"797ba2c7-ef03-4514-b48c-d4267a650fbc","Type":"ContainerDied","Data":"4275c5ddcf33489a4f177d17de9fe9c0a1a97e0163bd96f3770dd49a189c4a7a"} Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.625234 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-mhctb" Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.625252 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4275c5ddcf33489a4f177d17de9fe9c0a1a97e0163bd96f3770dd49a189c4a7a" Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.627362 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qdm5f" event={"ID":"9f881966-2c70-43ee-bcbd-2fca447e0697","Type":"ContainerDied","Data":"1cc313fa506a3d3ca8b8d694f85d2a527cc2c9f87808aa621373eeed56b4b9b5"} Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.627410 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cc313fa506a3d3ca8b8d694f85d2a527cc2c9f87808aa621373eeed56b4b9b5" Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.627436 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qdm5f" Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.629898 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hmf2g" event={"ID":"16cbe17e-851b-4275-9a2b-13ca14459b4a","Type":"ContainerDied","Data":"08e5083a5500068b009930866e3dbc40003d007d0786167666d712fccb640fde"} Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.629931 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08e5083a5500068b009930866e3dbc40003d007d0786167666d712fccb640fde" Oct 12 07:48:59 crc kubenswrapper[4599]: I1012 07:48:59.629934 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hmf2g" Oct 12 07:49:00 crc kubenswrapper[4599]: I1012 07:49:00.643078 4599 generic.go:334] "Generic (PLEG): container finished" podID="b0041b3a-cda2-439c-96ae-673642206886" containerID="924847be9fa5a8aa64d08a40d77dad3f80609e2a1894277131b7b8f46b054cac" exitCode=0 Oct 12 07:49:00 crc kubenswrapper[4599]: I1012 07:49:00.643154 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-p542w" event={"ID":"b0041b3a-cda2-439c-96ae-673642206886","Type":"ContainerDied","Data":"924847be9fa5a8aa64d08a40d77dad3f80609e2a1894277131b7b8f46b054cac"} Oct 12 07:49:01 crc kubenswrapper[4599]: I1012 07:49:01.655022 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-s54p2" event={"ID":"ff665457-da4e-4e70-a9be-a64b343bd4d0","Type":"ContainerStarted","Data":"64e1a086baf135ecc5b91d8a3df91b4d276ae6107b96740f0e62ff6c6eb47683"} Oct 12 07:49:01 crc kubenswrapper[4599]: I1012 07:49:01.674828 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-s54p2" podStartSLOduration=6.643989745 podStartE2EDuration="11.674807165s" podCreationTimestamp="2025-10-12 07:48:50 +0000 UTC" firstStartedPulling="2025-10-12 07:48:56.472295714 +0000 UTC m=+833.261491216" lastFinishedPulling="2025-10-12 07:49:01.503113135 +0000 UTC m=+838.292308636" observedRunningTime="2025-10-12 07:49:01.672542974 +0000 UTC m=+838.461738476" watchObservedRunningTime="2025-10-12 07:49:01.674807165 +0000 UTC m=+838.464002668" Oct 12 07:49:01 crc kubenswrapper[4599]: I1012 07:49:01.990053 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-p542w" Oct 12 07:49:02 crc kubenswrapper[4599]: I1012 07:49:02.147420 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0041b3a-cda2-439c-96ae-673642206886-combined-ca-bundle\") pod \"b0041b3a-cda2-439c-96ae-673642206886\" (UID: \"b0041b3a-cda2-439c-96ae-673642206886\") " Oct 12 07:49:02 crc kubenswrapper[4599]: I1012 07:49:02.147805 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0041b3a-cda2-439c-96ae-673642206886-config-data\") pod \"b0041b3a-cda2-439c-96ae-673642206886\" (UID: \"b0041b3a-cda2-439c-96ae-673642206886\") " Oct 12 07:49:02 crc kubenswrapper[4599]: I1012 07:49:02.147853 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b0041b3a-cda2-439c-96ae-673642206886-db-sync-config-data\") pod \"b0041b3a-cda2-439c-96ae-673642206886\" (UID: \"b0041b3a-cda2-439c-96ae-673642206886\") " Oct 12 07:49:02 crc kubenswrapper[4599]: I1012 07:49:02.147891 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcglj\" (UniqueName: \"kubernetes.io/projected/b0041b3a-cda2-439c-96ae-673642206886-kube-api-access-vcglj\") pod \"b0041b3a-cda2-439c-96ae-673642206886\" (UID: \"b0041b3a-cda2-439c-96ae-673642206886\") " Oct 12 07:49:02 crc kubenswrapper[4599]: I1012 07:49:02.154821 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0041b3a-cda2-439c-96ae-673642206886-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b0041b3a-cda2-439c-96ae-673642206886" (UID: "b0041b3a-cda2-439c-96ae-673642206886"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:02 crc kubenswrapper[4599]: I1012 07:49:02.155248 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0041b3a-cda2-439c-96ae-673642206886-kube-api-access-vcglj" (OuterVolumeSpecName: "kube-api-access-vcglj") pod "b0041b3a-cda2-439c-96ae-673642206886" (UID: "b0041b3a-cda2-439c-96ae-673642206886"). InnerVolumeSpecName "kube-api-access-vcglj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:02 crc kubenswrapper[4599]: I1012 07:49:02.170551 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0041b3a-cda2-439c-96ae-673642206886-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0041b3a-cda2-439c-96ae-673642206886" (UID: "b0041b3a-cda2-439c-96ae-673642206886"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:02 crc kubenswrapper[4599]: I1012 07:49:02.187323 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0041b3a-cda2-439c-96ae-673642206886-config-data" (OuterVolumeSpecName: "config-data") pod "b0041b3a-cda2-439c-96ae-673642206886" (UID: "b0041b3a-cda2-439c-96ae-673642206886"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:02 crc kubenswrapper[4599]: I1012 07:49:02.249518 4599 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b0041b3a-cda2-439c-96ae-673642206886-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:02 crc kubenswrapper[4599]: I1012 07:49:02.249547 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcglj\" (UniqueName: \"kubernetes.io/projected/b0041b3a-cda2-439c-96ae-673642206886-kube-api-access-vcglj\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:02 crc kubenswrapper[4599]: I1012 07:49:02.249563 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0041b3a-cda2-439c-96ae-673642206886-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:02 crc kubenswrapper[4599]: I1012 07:49:02.249574 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0041b3a-cda2-439c-96ae-673642206886-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:02 crc kubenswrapper[4599]: I1012 07:49:02.682649 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-p542w" Oct 12 07:49:02 crc kubenswrapper[4599]: I1012 07:49:02.682702 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-p542w" event={"ID":"b0041b3a-cda2-439c-96ae-673642206886","Type":"ContainerDied","Data":"900aea57814e3b2ecb9df6b84fee421cc777399b533b0eac91ee939a4126cad8"} Oct 12 07:49:02 crc kubenswrapper[4599]: I1012 07:49:02.682733 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="900aea57814e3b2ecb9df6b84fee421cc777399b533b0eac91ee939a4126cad8" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.054888 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b47fdb5b7-gd7kj"] Oct 12 07:49:03 crc kubenswrapper[4599]: E1012 07:49:03.055426 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c467ff-3eaf-42e6-94c7-6699da7f8be8" containerName="registry-server" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.055455 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c467ff-3eaf-42e6-94c7-6699da7f8be8" containerName="registry-server" Oct 12 07:49:03 crc kubenswrapper[4599]: E1012 07:49:03.055475 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="946b8921-7749-4631-a3a0-17b48397fb1a" containerName="init" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.055494 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="946b8921-7749-4631-a3a0-17b48397fb1a" containerName="init" Oct 12 07:49:03 crc kubenswrapper[4599]: E1012 07:49:03.055517 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0041b3a-cda2-439c-96ae-673642206886" containerName="glance-db-sync" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.055523 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0041b3a-cda2-439c-96ae-673642206886" containerName="glance-db-sync" Oct 12 07:49:03 crc kubenswrapper[4599]: E1012 07:49:03.055530 4599 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="16cbe17e-851b-4275-9a2b-13ca14459b4a" containerName="mariadb-database-create" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.055552 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="16cbe17e-851b-4275-9a2b-13ca14459b4a" containerName="mariadb-database-create" Oct 12 07:49:03 crc kubenswrapper[4599]: E1012 07:49:03.055562 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="797ba2c7-ef03-4514-b48c-d4267a650fbc" containerName="mariadb-database-create" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.055569 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="797ba2c7-ef03-4514-b48c-d4267a650fbc" containerName="mariadb-database-create" Oct 12 07:49:03 crc kubenswrapper[4599]: E1012 07:49:03.055591 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f881966-2c70-43ee-bcbd-2fca447e0697" containerName="mariadb-database-create" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.055605 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f881966-2c70-43ee-bcbd-2fca447e0697" containerName="mariadb-database-create" Oct 12 07:49:03 crc kubenswrapper[4599]: E1012 07:49:03.055613 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c467ff-3eaf-42e6-94c7-6699da7f8be8" containerName="extract-content" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.055627 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c467ff-3eaf-42e6-94c7-6699da7f8be8" containerName="extract-content" Oct 12 07:49:03 crc kubenswrapper[4599]: E1012 07:49:03.055640 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c467ff-3eaf-42e6-94c7-6699da7f8be8" containerName="extract-utilities" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.055656 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c467ff-3eaf-42e6-94c7-6699da7f8be8" containerName="extract-utilities" Oct 12 07:49:03 crc kubenswrapper[4599]: E1012 
07:49:03.055673 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12556204-17e0-478c-882d-cd17ccf87c13" containerName="ovn-config" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.055678 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="12556204-17e0-478c-882d-cd17ccf87c13" containerName="ovn-config" Oct 12 07:49:03 crc kubenswrapper[4599]: E1012 07:49:03.055686 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="946b8921-7749-4631-a3a0-17b48397fb1a" containerName="dnsmasq-dns" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.055692 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="946b8921-7749-4631-a3a0-17b48397fb1a" containerName="dnsmasq-dns" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.055892 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="797ba2c7-ef03-4514-b48c-d4267a650fbc" containerName="mariadb-database-create" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.055903 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="12556204-17e0-478c-882d-cd17ccf87c13" containerName="ovn-config" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.055917 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f881966-2c70-43ee-bcbd-2fca447e0697" containerName="mariadb-database-create" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.055923 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="946b8921-7749-4631-a3a0-17b48397fb1a" containerName="dnsmasq-dns" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.055933 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="16cbe17e-851b-4275-9a2b-13ca14459b4a" containerName="mariadb-database-create" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.055940 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0041b3a-cda2-439c-96ae-673642206886" containerName="glance-db-sync" Oct 12 07:49:03 crc 
kubenswrapper[4599]: I1012 07:49:03.055962 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="44c467ff-3eaf-42e6-94c7-6699da7f8be8" containerName="registry-server" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.057494 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.059745 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-dns-swift-storage-0\") pod \"dnsmasq-dns-6b47fdb5b7-gd7kj\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.059891 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-ovsdbserver-nb\") pod \"dnsmasq-dns-6b47fdb5b7-gd7kj\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.059984 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-ovsdbserver-sb\") pod \"dnsmasq-dns-6b47fdb5b7-gd7kj\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.060067 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-config\") pod \"dnsmasq-dns-6b47fdb5b7-gd7kj\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 
07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.060151 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8mk4\" (UniqueName: \"kubernetes.io/projected/531f1f31-7281-4e7f-af67-2592fd9733ed-kube-api-access-b8mk4\") pod \"dnsmasq-dns-6b47fdb5b7-gd7kj\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.060266 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-dns-svc\") pod \"dnsmasq-dns-6b47fdb5b7-gd7kj\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.081990 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b47fdb5b7-gd7kj"] Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.161456 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-ovsdbserver-sb\") pod \"dnsmasq-dns-6b47fdb5b7-gd7kj\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.161609 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-config\") pod \"dnsmasq-dns-6b47fdb5b7-gd7kj\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.161661 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8mk4\" (UniqueName: 
\"kubernetes.io/projected/531f1f31-7281-4e7f-af67-2592fd9733ed-kube-api-access-b8mk4\") pod \"dnsmasq-dns-6b47fdb5b7-gd7kj\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.161741 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-dns-svc\") pod \"dnsmasq-dns-6b47fdb5b7-gd7kj\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.161831 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-dns-swift-storage-0\") pod \"dnsmasq-dns-6b47fdb5b7-gd7kj\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.161858 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-ovsdbserver-nb\") pod \"dnsmasq-dns-6b47fdb5b7-gd7kj\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.162385 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-ovsdbserver-sb\") pod \"dnsmasq-dns-6b47fdb5b7-gd7kj\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.162786 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-ovsdbserver-nb\") pod \"dnsmasq-dns-6b47fdb5b7-gd7kj\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.163073 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-config\") pod \"dnsmasq-dns-6b47fdb5b7-gd7kj\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.163152 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-dns-swift-storage-0\") pod \"dnsmasq-dns-6b47fdb5b7-gd7kj\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.163326 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-dns-svc\") pod \"dnsmasq-dns-6b47fdb5b7-gd7kj\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.177889 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8mk4\" (UniqueName: \"kubernetes.io/projected/531f1f31-7281-4e7f-af67-2592fd9733ed-kube-api-access-b8mk4\") pod \"dnsmasq-dns-6b47fdb5b7-gd7kj\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.377428 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.691534 4599 generic.go:334] "Generic (PLEG): container finished" podID="ff665457-da4e-4e70-a9be-a64b343bd4d0" containerID="64e1a086baf135ecc5b91d8a3df91b4d276ae6107b96740f0e62ff6c6eb47683" exitCode=0 Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.691599 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-s54p2" event={"ID":"ff665457-da4e-4e70-a9be-a64b343bd4d0","Type":"ContainerDied","Data":"64e1a086baf135ecc5b91d8a3df91b4d276ae6107b96740f0e62ff6c6eb47683"} Oct 12 07:49:03 crc kubenswrapper[4599]: I1012 07:49:03.784204 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b47fdb5b7-gd7kj"] Oct 12 07:49:03 crc kubenswrapper[4599]: W1012 07:49:03.785915 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod531f1f31_7281_4e7f_af67_2592fd9733ed.slice/crio-368101a86eb95476ab63bf60ca9fcfa9c630d1596c32af4e61606390b1a72042 WatchSource:0}: Error finding container 368101a86eb95476ab63bf60ca9fcfa9c630d1596c32af4e61606390b1a72042: Status 404 returned error can't find the container with id 368101a86eb95476ab63bf60ca9fcfa9c630d1596c32af4e61606390b1a72042 Oct 12 07:49:04 crc kubenswrapper[4599]: I1012 07:49:04.699158 4599 generic.go:334] "Generic (PLEG): container finished" podID="531f1f31-7281-4e7f-af67-2592fd9733ed" containerID="885867dbbc62cca91e6642aad02770bb3437ac47176b4e6791222319eccec0ec" exitCode=0 Oct 12 07:49:04 crc kubenswrapper[4599]: I1012 07:49:04.699209 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" event={"ID":"531f1f31-7281-4e7f-af67-2592fd9733ed","Type":"ContainerDied","Data":"885867dbbc62cca91e6642aad02770bb3437ac47176b4e6791222319eccec0ec"} Oct 12 07:49:04 crc kubenswrapper[4599]: I1012 07:49:04.699529 4599 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" event={"ID":"531f1f31-7281-4e7f-af67-2592fd9733ed","Type":"ContainerStarted","Data":"368101a86eb95476ab63bf60ca9fcfa9c630d1596c32af4e61606390b1a72042"} Oct 12 07:49:04 crc kubenswrapper[4599]: I1012 07:49:04.994311 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-s54p2" Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.196061 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgv5b\" (UniqueName: \"kubernetes.io/projected/ff665457-da4e-4e70-a9be-a64b343bd4d0-kube-api-access-sgv5b\") pod \"ff665457-da4e-4e70-a9be-a64b343bd4d0\" (UID: \"ff665457-da4e-4e70-a9be-a64b343bd4d0\") " Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.196237 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff665457-da4e-4e70-a9be-a64b343bd4d0-combined-ca-bundle\") pod \"ff665457-da4e-4e70-a9be-a64b343bd4d0\" (UID: \"ff665457-da4e-4e70-a9be-a64b343bd4d0\") " Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.196331 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff665457-da4e-4e70-a9be-a64b343bd4d0-config-data\") pod \"ff665457-da4e-4e70-a9be-a64b343bd4d0\" (UID: \"ff665457-da4e-4e70-a9be-a64b343bd4d0\") " Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.202249 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff665457-da4e-4e70-a9be-a64b343bd4d0-kube-api-access-sgv5b" (OuterVolumeSpecName: "kube-api-access-sgv5b") pod "ff665457-da4e-4e70-a9be-a64b343bd4d0" (UID: "ff665457-da4e-4e70-a9be-a64b343bd4d0"). InnerVolumeSpecName "kube-api-access-sgv5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.219840 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff665457-da4e-4e70-a9be-a64b343bd4d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff665457-da4e-4e70-a9be-a64b343bd4d0" (UID: "ff665457-da4e-4e70-a9be-a64b343bd4d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.239967 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff665457-da4e-4e70-a9be-a64b343bd4d0-config-data" (OuterVolumeSpecName: "config-data") pod "ff665457-da4e-4e70-a9be-a64b343bd4d0" (UID: "ff665457-da4e-4e70-a9be-a64b343bd4d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.300347 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgv5b\" (UniqueName: \"kubernetes.io/projected/ff665457-da4e-4e70-a9be-a64b343bd4d0-kube-api-access-sgv5b\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.300397 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff665457-da4e-4e70-a9be-a64b343bd4d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.300409 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff665457-da4e-4e70-a9be-a64b343bd4d0-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.710555 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-s54p2" 
event={"ID":"ff665457-da4e-4e70-a9be-a64b343bd4d0","Type":"ContainerDied","Data":"ae204b199e580622a970f2efe4131cd591867f417329ba7c6fa2f810720afda2"} Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.710620 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-s54p2" Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.710633 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae204b199e580622a970f2efe4131cd591867f417329ba7c6fa2f810720afda2" Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.712734 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" event={"ID":"531f1f31-7281-4e7f-af67-2592fd9733ed","Type":"ContainerStarted","Data":"700cf6a329fe7f2d5d093a843045e42d66370ba4e8a5cd917f57abdceea7d486"} Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.712937 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.827817 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" podStartSLOduration=2.8278003910000002 podStartE2EDuration="2.827800391s" podCreationTimestamp="2025-10-12 07:49:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:49:05.73432182 +0000 UTC m=+842.523517322" watchObservedRunningTime="2025-10-12 07:49:05.827800391 +0000 UTC m=+842.616995892" Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.828392 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b47fdb5b7-gd7kj"] Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.856016 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c7fbdd79f-wl96w"] Oct 12 07:49:05 crc kubenswrapper[4599]: E1012 
07:49:05.856309 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff665457-da4e-4e70-a9be-a64b343bd4d0" containerName="keystone-db-sync" Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.856324 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff665457-da4e-4e70-a9be-a64b343bd4d0" containerName="keystone-db-sync" Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.856536 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff665457-da4e-4e70-a9be-a64b343bd4d0" containerName="keystone-db-sync" Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.857400 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.864400 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c7fbdd79f-wl96w"] Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.897446 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-zh5h5"] Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.898352 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.900980 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.900990 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.902680 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.902817 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6vkdx" Oct 12 07:49:05 crc kubenswrapper[4599]: I1012 07:49:05.921791 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zh5h5"] Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.008329 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-credential-keys\") pod \"keystone-bootstrap-zh5h5\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.008388 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cxd4\" (UniqueName: \"kubernetes.io/projected/2e0df496-5c55-4a44-89b6-5acd786c4aa2-kube-api-access-7cxd4\") pod \"dnsmasq-dns-6c7fbdd79f-wl96w\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.008410 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-config-data\") pod 
\"keystone-bootstrap-zh5h5\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.008922 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-scripts\") pod \"keystone-bootstrap-zh5h5\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.008992 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-dns-swift-storage-0\") pod \"dnsmasq-dns-6c7fbdd79f-wl96w\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.009022 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-dns-svc\") pod \"dnsmasq-dns-6c7fbdd79f-wl96w\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.009044 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-fernet-keys\") pod \"keystone-bootstrap-zh5h5\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.009141 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6c7fbdd79f-wl96w\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.009170 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-config\") pod \"dnsmasq-dns-6c7fbdd79f-wl96w\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.009295 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-ovsdbserver-nb\") pod \"dnsmasq-dns-6c7fbdd79f-wl96w\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.009416 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-combined-ca-bundle\") pod \"keystone-bootstrap-zh5h5\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.009462 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhklk\" (UniqueName: \"kubernetes.io/projected/60c2daa7-082e-4c22-a040-67cbfd077bf4-kube-api-access-zhklk\") pod \"keystone-bootstrap-zh5h5\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.010717 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.012582 4599 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.014775 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.018780 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.028822 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.110666 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-combined-ca-bundle\") pod \"keystone-bootstrap-zh5h5\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.110719 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhklk\" (UniqueName: \"kubernetes.io/projected/60c2daa7-082e-4c22-a040-67cbfd077bf4-kube-api-access-zhklk\") pod \"keystone-bootstrap-zh5h5\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.110767 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-credential-keys\") pod \"keystone-bootstrap-zh5h5\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.110791 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cxd4\" (UniqueName: \"kubernetes.io/projected/2e0df496-5c55-4a44-89b6-5acd786c4aa2-kube-api-access-7cxd4\") pod 
\"dnsmasq-dns-6c7fbdd79f-wl96w\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.110808 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-config-data\") pod \"keystone-bootstrap-zh5h5\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.110841 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-scripts\") pod \"keystone-bootstrap-zh5h5\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.110880 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-dns-swift-storage-0\") pod \"dnsmasq-dns-6c7fbdd79f-wl96w\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.110904 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-dns-svc\") pod \"dnsmasq-dns-6c7fbdd79f-wl96w\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.110926 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-fernet-keys\") pod \"keystone-bootstrap-zh5h5\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " 
pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.110945 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-ovsdbserver-sb\") pod \"dnsmasq-dns-6c7fbdd79f-wl96w\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.110959 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-config\") pod \"dnsmasq-dns-6c7fbdd79f-wl96w\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.110987 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-ovsdbserver-nb\") pod \"dnsmasq-dns-6c7fbdd79f-wl96w\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.112235 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-ovsdbserver-nb\") pod \"dnsmasq-dns-6c7fbdd79f-wl96w\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.113239 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-ovsdbserver-sb\") pod \"dnsmasq-dns-6c7fbdd79f-wl96w\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:06 crc kubenswrapper[4599]: 
I1012 07:49:06.113919 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-dns-svc\") pod \"dnsmasq-dns-6c7fbdd79f-wl96w\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.113946 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-dns-swift-storage-0\") pod \"dnsmasq-dns-6c7fbdd79f-wl96w\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.114589 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-config\") pod \"dnsmasq-dns-6c7fbdd79f-wl96w\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.116806 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-scripts\") pod \"keystone-bootstrap-zh5h5\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.118231 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-fernet-keys\") pod \"keystone-bootstrap-zh5h5\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.118537 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-credential-keys\") pod \"keystone-bootstrap-zh5h5\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.123373 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-combined-ca-bundle\") pod \"keystone-bootstrap-zh5h5\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.139124 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhklk\" (UniqueName: \"kubernetes.io/projected/60c2daa7-082e-4c22-a040-67cbfd077bf4-kube-api-access-zhklk\") pod \"keystone-bootstrap-zh5h5\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.140968 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cxd4\" (UniqueName: \"kubernetes.io/projected/2e0df496-5c55-4a44-89b6-5acd786c4aa2-kube-api-access-7cxd4\") pod \"dnsmasq-dns-6c7fbdd79f-wl96w\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.147916 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-config-data\") pod \"keystone-bootstrap-zh5h5\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.172595 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.213070 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-pkz8c"] Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.213543 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.214415 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.214507 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.214533 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f94bf31d-933c-4cca-998a-4c3fc9a451f4-run-httpd\") pod \"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.214553 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hlxx\" (UniqueName: \"kubernetes.io/projected/f94bf31d-933c-4cca-998a-4c3fc9a451f4-kube-api-access-5hlxx\") pod \"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.214600 4599 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-scripts\") pod \"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.214620 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f94bf31d-933c-4cca-998a-4c3fc9a451f4-log-httpd\") pod \"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.214676 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-config-data\") pod \"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.223806 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-pkz8c" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.226563 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.226654 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.226759 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-kqzdj" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.226866 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-pkz8c"] Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.252591 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c7fbdd79f-wl96w"] Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.275747 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-977d96ff-6cnmb"] Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.278180 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.286134 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-977d96ff-6cnmb"] Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.317636 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-scripts\") pod \"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.317694 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f94bf31d-933c-4cca-998a-4c3fc9a451f4-log-httpd\") pod \"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.317719 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-config-data\") pod \"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.317772 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.317827 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 
07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.317848 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f94bf31d-933c-4cca-998a-4c3fc9a451f4-run-httpd\") pod \"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.317866 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hlxx\" (UniqueName: \"kubernetes.io/projected/f94bf31d-933c-4cca-998a-4c3fc9a451f4-kube-api-access-5hlxx\") pod \"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.323520 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f94bf31d-933c-4cca-998a-4c3fc9a451f4-log-httpd\") pod \"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.323568 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f94bf31d-933c-4cca-998a-4c3fc9a451f4-run-httpd\") pod \"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.328310 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-config-data\") pod \"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.329792 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.330708 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.333561 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-scripts\") pod \"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.337323 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hlxx\" (UniqueName: \"kubernetes.io/projected/f94bf31d-933c-4cca-998a-4c3fc9a451f4-kube-api-access-5hlxx\") pod \"ceilometer-0\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.424890 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-config-data\") pod \"placement-db-sync-pkz8c\" (UID: \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\") " pod="openstack/placement-db-sync-pkz8c" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.425134 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwk57\" (UniqueName: \"kubernetes.io/projected/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-kube-api-access-lwk57\") pod \"placement-db-sync-pkz8c\" (UID: \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\") " pod="openstack/placement-db-sync-pkz8c" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 
07:49:06.425264 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56mvd\" (UniqueName: \"kubernetes.io/projected/76d0d44d-634f-43a2-89b0-59874fdf35b5-kube-api-access-56mvd\") pod \"dnsmasq-dns-977d96ff-6cnmb\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.425382 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-scripts\") pod \"placement-db-sync-pkz8c\" (UID: \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\") " pod="openstack/placement-db-sync-pkz8c" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.425480 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-dns-swift-storage-0\") pod \"dnsmasq-dns-977d96ff-6cnmb\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.425591 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-ovsdbserver-nb\") pod \"dnsmasq-dns-977d96ff-6cnmb\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.425678 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-config\") pod \"dnsmasq-dns-977d96ff-6cnmb\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:06 crc kubenswrapper[4599]: 
I1012 07:49:06.425798 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-combined-ca-bundle\") pod \"placement-db-sync-pkz8c\" (UID: \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\") " pod="openstack/placement-db-sync-pkz8c" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.425915 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-dns-svc\") pod \"dnsmasq-dns-977d96ff-6cnmb\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.426059 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-logs\") pod \"placement-db-sync-pkz8c\" (UID: \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\") " pod="openstack/placement-db-sync-pkz8c" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.426164 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-ovsdbserver-sb\") pod \"dnsmasq-dns-977d96ff-6cnmb\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.526645 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-combined-ca-bundle\") pod \"placement-db-sync-pkz8c\" (UID: \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\") " pod="openstack/placement-db-sync-pkz8c" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.526691 4599 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-dns-svc\") pod \"dnsmasq-dns-977d96ff-6cnmb\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.526742 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-logs\") pod \"placement-db-sync-pkz8c\" (UID: \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\") " pod="openstack/placement-db-sync-pkz8c" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.526763 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-ovsdbserver-sb\") pod \"dnsmasq-dns-977d96ff-6cnmb\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.526806 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-config-data\") pod \"placement-db-sync-pkz8c\" (UID: \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\") " pod="openstack/placement-db-sync-pkz8c" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.526826 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwk57\" (UniqueName: \"kubernetes.io/projected/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-kube-api-access-lwk57\") pod \"placement-db-sync-pkz8c\" (UID: \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\") " pod="openstack/placement-db-sync-pkz8c" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.526844 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56mvd\" (UniqueName: 
\"kubernetes.io/projected/76d0d44d-634f-43a2-89b0-59874fdf35b5-kube-api-access-56mvd\") pod \"dnsmasq-dns-977d96ff-6cnmb\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.526864 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-scripts\") pod \"placement-db-sync-pkz8c\" (UID: \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\") " pod="openstack/placement-db-sync-pkz8c" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.526884 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-dns-swift-storage-0\") pod \"dnsmasq-dns-977d96ff-6cnmb\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.526902 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-ovsdbserver-nb\") pod \"dnsmasq-dns-977d96ff-6cnmb\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.526919 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-config\") pod \"dnsmasq-dns-977d96ff-6cnmb\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.527706 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-config\") pod \"dnsmasq-dns-977d96ff-6cnmb\" 
(UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.528108 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-dns-svc\") pod \"dnsmasq-dns-977d96ff-6cnmb\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.528408 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-logs\") pod \"placement-db-sync-pkz8c\" (UID: \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\") " pod="openstack/placement-db-sync-pkz8c" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.528840 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-ovsdbserver-sb\") pod \"dnsmasq-dns-977d96ff-6cnmb\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.529484 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-ovsdbserver-nb\") pod \"dnsmasq-dns-977d96ff-6cnmb\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.532746 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-scripts\") pod \"placement-db-sync-pkz8c\" (UID: \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\") " pod="openstack/placement-db-sync-pkz8c" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.532871 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-dns-swift-storage-0\") pod \"dnsmasq-dns-977d96ff-6cnmb\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.532937 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-combined-ca-bundle\") pod \"placement-db-sync-pkz8c\" (UID: \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\") " pod="openstack/placement-db-sync-pkz8c" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.533196 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-config-data\") pod \"placement-db-sync-pkz8c\" (UID: \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\") " pod="openstack/placement-db-sync-pkz8c" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.540724 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56mvd\" (UniqueName: \"kubernetes.io/projected/76d0d44d-634f-43a2-89b0-59874fdf35b5-kube-api-access-56mvd\") pod \"dnsmasq-dns-977d96ff-6cnmb\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.543978 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwk57\" (UniqueName: \"kubernetes.io/projected/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-kube-api-access-lwk57\") pod \"placement-db-sync-pkz8c\" (UID: \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\") " pod="openstack/placement-db-sync-pkz8c" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.601857 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-pkz8c" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.612132 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.625062 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.669666 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c7fbdd79f-wl96w"] Oct 12 07:49:06 crc kubenswrapper[4599]: W1012 07:49:06.673007 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e0df496_5c55_4a44_89b6_5acd786c4aa2.slice/crio-c0aab737be881d70693362e48275d96b86726ee5c965fdce364be4b659a7e648 WatchSource:0}: Error finding container c0aab737be881d70693362e48275d96b86726ee5c965fdce364be4b659a7e648: Status 404 returned error can't find the container with id c0aab737be881d70693362e48275d96b86726ee5c965fdce364be4b659a7e648 Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.731807 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" event={"ID":"2e0df496-5c55-4a44-89b6-5acd786c4aa2","Type":"ContainerStarted","Data":"c0aab737be881d70693362e48275d96b86726ee5c965fdce364be4b659a7e648"} Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.753396 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zh5h5"] Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.959223 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.960914 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.968644 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.968805 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sxsrd" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.968928 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.968971 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 12 07:49:06 crc kubenswrapper[4599]: I1012 07:49:06.979005 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.003610 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.011545 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.013427 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.013845 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.017114 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.043860 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-977d96ff-6cnmb"] Oct 12 07:49:07 crc kubenswrapper[4599]: W1012 07:49:07.044189 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76d0d44d_634f_43a2_89b0_59874fdf35b5.slice/crio-6ac04af0c17716c449dcc8f620738a11533e7a1cbf2ab11fc097d6d01477fd06 WatchSource:0}: Error finding container 6ac04af0c17716c449dcc8f620738a11533e7a1cbf2ab11fc097d6d01477fd06: Status 404 returned error can't find the container with id 6ac04af0c17716c449dcc8f620738a11533e7a1cbf2ab11fc097d6d01477fd06 Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.100116 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.103537 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-pkz8c"] Oct 12 07:49:07 crc kubenswrapper[4599]: W1012 07:49:07.105266 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf94bf31d_933c_4cca_998a_4c3fc9a451f4.slice/crio-289d224b3977712361e5b93f8c1a0b7e85fe80edaf86f173cd6b5e2b00d8de30 WatchSource:0}: Error finding container 
289d224b3977712361e5b93f8c1a0b7e85fe80edaf86f173cd6b5e2b00d8de30: Status 404 returned error can't find the container with id 289d224b3977712361e5b93f8c1a0b7e85fe80edaf86f173cd6b5e2b00d8de30 Oct 12 07:49:07 crc kubenswrapper[4599]: W1012 07:49:07.107676 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d43ed11_7175_498a_8dc1_e15cfc41b5c8.slice/crio-3b98a862f2a26e6aab4d936817b0e3e55f0f67169e1044617a0be1125a66b3cd WatchSource:0}: Error finding container 3b98a862f2a26e6aab4d936817b0e3e55f0f67169e1044617a0be1125a66b3cd: Status 404 returned error can't find the container with id 3b98a862f2a26e6aab4d936817b0e3e55f0f67169e1044617a0be1125a66b3cd Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.144463 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c477ca80-6743-4c07-b8ab-d9a083b87bcd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.144507 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.144611 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.144785 4599 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.144828 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.144854 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wkr7\" (UniqueName: \"kubernetes.io/projected/c477ca80-6743-4c07-b8ab-d9a083b87bcd-kube-api-access-7wkr7\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.144925 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.145588 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-config-data\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 
07:49:07.145631 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.145682 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.145797 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.145880 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxxqv\" (UniqueName: \"kubernetes.io/projected/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-kube-api-access-rxxqv\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.145935 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " 
pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.145960 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-scripts\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.145980 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-logs\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.145999 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c477ca80-6743-4c07-b8ab-d9a083b87bcd-logs\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.247280 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxxqv\" (UniqueName: \"kubernetes.io/projected/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-kube-api-access-rxxqv\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.247364 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " 
pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.247388 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-scripts\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.247405 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-logs\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.247429 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c477ca80-6743-4c07-b8ab-d9a083b87bcd-logs\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.247449 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c477ca80-6743-4c07-b8ab-d9a083b87bcd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.247464 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.247496 4599 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.247517 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.247534 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.247548 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wkr7\" (UniqueName: \"kubernetes.io/projected/c477ca80-6743-4c07-b8ab-d9a083b87bcd-kube-api-access-7wkr7\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.247572 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.247600 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-config-data\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.247619 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.247652 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.247670 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.248237 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c477ca80-6743-4c07-b8ab-d9a083b87bcd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.248572 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-logs\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.248790 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c477ca80-6743-4c07-b8ab-d9a083b87bcd-logs\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.249758 4599 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.249795 4599 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.253125 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-scripts\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.253298 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-combined-ca-bundle\") 
pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.254519 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.254530 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.258434 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.261178 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.263398 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-config-data\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " 
pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.264672 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.264701 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.267419 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wkr7\" (UniqueName: \"kubernetes.io/projected/c477ca80-6743-4c07-b8ab-d9a083b87bcd-kube-api-access-7wkr7\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.267627 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxxqv\" (UniqueName: \"kubernetes.io/projected/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-kube-api-access-rxxqv\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.295626 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc 
kubenswrapper[4599]: I1012 07:49:07.309301 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.335272 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.341794 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.741065 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f94bf31d-933c-4cca-998a-4c3fc9a451f4","Type":"ContainerStarted","Data":"289d224b3977712361e5b93f8c1a0b7e85fe80edaf86f173cd6b5e2b00d8de30"} Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.742779 4599 generic.go:334] "Generic (PLEG): container finished" podID="76d0d44d-634f-43a2-89b0-59874fdf35b5" containerID="43519f28e0fdc3576efd7baa3c616384644e0f881b9d82799f685dcef1e6361c" exitCode=0 Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.742864 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-977d96ff-6cnmb" event={"ID":"76d0d44d-634f-43a2-89b0-59874fdf35b5","Type":"ContainerDied","Data":"43519f28e0fdc3576efd7baa3c616384644e0f881b9d82799f685dcef1e6361c"} Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.742906 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-977d96ff-6cnmb" event={"ID":"76d0d44d-634f-43a2-89b0-59874fdf35b5","Type":"ContainerStarted","Data":"6ac04af0c17716c449dcc8f620738a11533e7a1cbf2ab11fc097d6d01477fd06"} Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.744395 4599 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/placement-db-sync-pkz8c" event={"ID":"9d43ed11-7175-498a-8dc1-e15cfc41b5c8","Type":"ContainerStarted","Data":"3b98a862f2a26e6aab4d936817b0e3e55f0f67169e1044617a0be1125a66b3cd"} Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.745868 4599 generic.go:334] "Generic (PLEG): container finished" podID="2e0df496-5c55-4a44-89b6-5acd786c4aa2" containerID="c63941cf77aeda2bbec1d8bc2f4675d44f2a8e3b145525631fe8146099a3bf3e" exitCode=0 Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.746130 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" event={"ID":"2e0df496-5c55-4a44-89b6-5acd786c4aa2","Type":"ContainerDied","Data":"c63941cf77aeda2bbec1d8bc2f4675d44f2a8e3b145525631fe8146099a3bf3e"} Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.753561 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" podUID="531f1f31-7281-4e7f-af67-2592fd9733ed" containerName="dnsmasq-dns" containerID="cri-o://700cf6a329fe7f2d5d093a843045e42d66370ba4e8a5cd917f57abdceea7d486" gracePeriod=10 Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.754458 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zh5h5" event={"ID":"60c2daa7-082e-4c22-a040-67cbfd077bf4","Type":"ContainerStarted","Data":"e42beb0063eaf01d7c5f602a38d181a55b0b694e53c7f9a9dd627c1a22637b2d"} Oct 12 07:49:07 crc kubenswrapper[4599]: I1012 07:49:07.754479 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zh5h5" event={"ID":"60c2daa7-082e-4c22-a040-67cbfd077bf4","Type":"ContainerStarted","Data":"e0d880548bb07640f6d1418cbbe02d9884e9a3998566c51b24caa1443687778e"} Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:07.819252 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-zh5h5" podStartSLOduration=2.819229409 podStartE2EDuration="2.819229409s" 
podCreationTimestamp="2025-10-12 07:49:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:49:07.803600543 +0000 UTC m=+844.592796046" watchObservedRunningTime="2025-10-12 07:49:07.819229409 +0000 UTC m=+844.608424911" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:07.850835 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 07:49:08 crc kubenswrapper[4599]: W1012 07:49:07.861588 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc477ca80_6743_4c07_b8ab_d9a083b87bcd.slice/crio-621dcac8e6439133ed39eb3897124cbef040db79720909504948f9366308590c WatchSource:0}: Error finding container 621dcac8e6439133ed39eb3897124cbef040db79720909504948f9366308590c: Status 404 returned error can't find the container with id 621dcac8e6439133ed39eb3897124cbef040db79720909504948f9366308590c Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.024093 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.671066 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.707664 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.777286 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-ovsdbserver-nb\") pod \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.777400 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-config\") pod \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.777447 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cxd4\" (UniqueName: \"kubernetes.io/projected/2e0df496-5c55-4a44-89b6-5acd786c4aa2-kube-api-access-7cxd4\") pod \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.777476 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-dns-swift-storage-0\") pod \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.777605 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-ovsdbserver-sb\") pod \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.777618 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-dns-svc\") pod \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\" (UID: \"2e0df496-5c55-4a44-89b6-5acd786c4aa2\") " Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.785272 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e0df496-5c55-4a44-89b6-5acd786c4aa2-kube-api-access-7cxd4" (OuterVolumeSpecName: "kube-api-access-7cxd4") pod "2e0df496-5c55-4a44-89b6-5acd786c4aa2" (UID: "2e0df496-5c55-4a44-89b6-5acd786c4aa2"). InnerVolumeSpecName "kube-api-access-7cxd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.786141 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-977d96ff-6cnmb" event={"ID":"76d0d44d-634f-43a2-89b0-59874fdf35b5","Type":"ContainerStarted","Data":"47b68a2bf58dedea2ed87f85daf28cc535cac673a563c130380a309b685a6627"} Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.787250 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.795682 4599 generic.go:334] "Generic (PLEG): container finished" podID="531f1f31-7281-4e7f-af67-2592fd9733ed" containerID="700cf6a329fe7f2d5d093a843045e42d66370ba4e8a5cd917f57abdceea7d486" exitCode=0 Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.795753 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" event={"ID":"531f1f31-7281-4e7f-af67-2592fd9733ed","Type":"ContainerDied","Data":"700cf6a329fe7f2d5d093a843045e42d66370ba4e8a5cd917f57abdceea7d486"} Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.795785 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" 
event={"ID":"531f1f31-7281-4e7f-af67-2592fd9733ed","Type":"ContainerDied","Data":"368101a86eb95476ab63bf60ca9fcfa9c630d1596c32af4e61606390b1a72042"} Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.795806 4599 scope.go:117] "RemoveContainer" containerID="700cf6a329fe7f2d5d093a843045e42d66370ba4e8a5cd917f57abdceea7d486" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.795936 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b47fdb5b7-gd7kj" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.814702 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-977d96ff-6cnmb" podStartSLOduration=2.814686605 podStartE2EDuration="2.814686605s" podCreationTimestamp="2025-10-12 07:49:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:49:08.803003566 +0000 UTC m=+845.592199068" watchObservedRunningTime="2025-10-12 07:49:08.814686605 +0000 UTC m=+845.603882107" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.815442 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2e0df496-5c55-4a44-89b6-5acd786c4aa2" (UID: "2e0df496-5c55-4a44-89b6-5acd786c4aa2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.821962 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba6bf0af-7d41-4886-9222-3c46d8ba55e2","Type":"ContainerStarted","Data":"77aed42684f97a3603ff2909ec9719ff2c14c136e27aa20e9688ba4a364c5739"} Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.822030 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba6bf0af-7d41-4886-9222-3c46d8ba55e2","Type":"ContainerStarted","Data":"8742dff11f6ccf82fa611a06f474052f8fa67fb23f37bea59fb07df94f665108"} Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.822493 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2e0df496-5c55-4a44-89b6-5acd786c4aa2" (UID: "2e0df496-5c55-4a44-89b6-5acd786c4aa2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.829431 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" event={"ID":"2e0df496-5c55-4a44-89b6-5acd786c4aa2","Type":"ContainerDied","Data":"c0aab737be881d70693362e48275d96b86726ee5c965fdce364be4b659a7e648"} Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.829477 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c7fbdd79f-wl96w" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.832036 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2e0df496-5c55-4a44-89b6-5acd786c4aa2" (UID: "2e0df496-5c55-4a44-89b6-5acd786c4aa2"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.832562 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-config" (OuterVolumeSpecName: "config") pod "2e0df496-5c55-4a44-89b6-5acd786c4aa2" (UID: "2e0df496-5c55-4a44-89b6-5acd786c4aa2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.833647 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2e0df496-5c55-4a44-89b6-5acd786c4aa2" (UID: "2e0df496-5c55-4a44-89b6-5acd786c4aa2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.835301 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c477ca80-6743-4c07-b8ab-d9a083b87bcd","Type":"ContainerStarted","Data":"d58fa2adf48aab55a1f6b72219075412b7688b4dbe7c1061af4723835d667f7c"} Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.835432 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c477ca80-6743-4c07-b8ab-d9a083b87bcd","Type":"ContainerStarted","Data":"621dcac8e6439133ed39eb3897124cbef040db79720909504948f9366308590c"} Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.847097 4599 scope.go:117] "RemoveContainer" containerID="885867dbbc62cca91e6642aad02770bb3437ac47176b4e6791222319eccec0ec" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.880114 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-config\") pod \"531f1f31-7281-4e7f-af67-2592fd9733ed\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.880253 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-dns-svc\") pod \"531f1f31-7281-4e7f-af67-2592fd9733ed\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.880317 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-ovsdbserver-sb\") pod \"531f1f31-7281-4e7f-af67-2592fd9733ed\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.880393 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-ovsdbserver-nb\") pod \"531f1f31-7281-4e7f-af67-2592fd9733ed\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.880429 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8mk4\" (UniqueName: \"kubernetes.io/projected/531f1f31-7281-4e7f-af67-2592fd9733ed-kube-api-access-b8mk4\") pod \"531f1f31-7281-4e7f-af67-2592fd9733ed\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.880473 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-dns-swift-storage-0\") pod \"531f1f31-7281-4e7f-af67-2592fd9733ed\" (UID: \"531f1f31-7281-4e7f-af67-2592fd9733ed\") " Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 
07:49:08.880836 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.880853 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cxd4\" (UniqueName: \"kubernetes.io/projected/2e0df496-5c55-4a44-89b6-5acd786c4aa2-kube-api-access-7cxd4\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.880865 4599 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.880874 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.880882 4599 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.880889 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e0df496-5c55-4a44-89b6-5acd786c4aa2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.894604 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/531f1f31-7281-4e7f-af67-2592fd9733ed-kube-api-access-b8mk4" (OuterVolumeSpecName: "kube-api-access-b8mk4") pod "531f1f31-7281-4e7f-af67-2592fd9733ed" (UID: "531f1f31-7281-4e7f-af67-2592fd9733ed"). InnerVolumeSpecName "kube-api-access-b8mk4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.925908 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "531f1f31-7281-4e7f-af67-2592fd9733ed" (UID: "531f1f31-7281-4e7f-af67-2592fd9733ed"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.929998 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "531f1f31-7281-4e7f-af67-2592fd9733ed" (UID: "531f1f31-7281-4e7f-af67-2592fd9733ed"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.934539 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "531f1f31-7281-4e7f-af67-2592fd9733ed" (UID: "531f1f31-7281-4e7f-af67-2592fd9733ed"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.936945 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-config" (OuterVolumeSpecName: "config") pod "531f1f31-7281-4e7f-af67-2592fd9733ed" (UID: "531f1f31-7281-4e7f-af67-2592fd9733ed"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.951519 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "531f1f31-7281-4e7f-af67-2592fd9733ed" (UID: "531f1f31-7281-4e7f-af67-2592fd9733ed"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.982437 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.982469 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.982479 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8mk4\" (UniqueName: \"kubernetes.io/projected/531f1f31-7281-4e7f-af67-2592fd9733ed-kube-api-access-b8mk4\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.982489 4599 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.982499 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:08 crc kubenswrapper[4599]: I1012 07:49:08.982507 4599 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/531f1f31-7281-4e7f-af67-2592fd9733ed-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.013638 4599 scope.go:117] "RemoveContainer" containerID="700cf6a329fe7f2d5d093a843045e42d66370ba4e8a5cd917f57abdceea7d486" Oct 12 07:49:09 crc kubenswrapper[4599]: E1012 07:49:09.014157 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"700cf6a329fe7f2d5d093a843045e42d66370ba4e8a5cd917f57abdceea7d486\": container with ID starting with 700cf6a329fe7f2d5d093a843045e42d66370ba4e8a5cd917f57abdceea7d486 not found: ID does not exist" containerID="700cf6a329fe7f2d5d093a843045e42d66370ba4e8a5cd917f57abdceea7d486" Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.014221 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"700cf6a329fe7f2d5d093a843045e42d66370ba4e8a5cd917f57abdceea7d486"} err="failed to get container status \"700cf6a329fe7f2d5d093a843045e42d66370ba4e8a5cd917f57abdceea7d486\": rpc error: code = NotFound desc = could not find container \"700cf6a329fe7f2d5d093a843045e42d66370ba4e8a5cd917f57abdceea7d486\": container with ID starting with 700cf6a329fe7f2d5d093a843045e42d66370ba4e8a5cd917f57abdceea7d486 not found: ID does not exist" Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.014261 4599 scope.go:117] "RemoveContainer" containerID="885867dbbc62cca91e6642aad02770bb3437ac47176b4e6791222319eccec0ec" Oct 12 07:49:09 crc kubenswrapper[4599]: E1012 07:49:09.014749 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"885867dbbc62cca91e6642aad02770bb3437ac47176b4e6791222319eccec0ec\": container with ID starting with 885867dbbc62cca91e6642aad02770bb3437ac47176b4e6791222319eccec0ec not found: ID does not exist" containerID="885867dbbc62cca91e6642aad02770bb3437ac47176b4e6791222319eccec0ec" Oct 12 
07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.014794 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"885867dbbc62cca91e6642aad02770bb3437ac47176b4e6791222319eccec0ec"} err="failed to get container status \"885867dbbc62cca91e6642aad02770bb3437ac47176b4e6791222319eccec0ec\": rpc error: code = NotFound desc = could not find container \"885867dbbc62cca91e6642aad02770bb3437ac47176b4e6791222319eccec0ec\": container with ID starting with 885867dbbc62cca91e6642aad02770bb3437ac47176b4e6791222319eccec0ec not found: ID does not exist" Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.014846 4599 scope.go:117] "RemoveContainer" containerID="c63941cf77aeda2bbec1d8bc2f4675d44f2a8e3b145525631fe8146099a3bf3e" Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.056071 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.104268 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.111593 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.151416 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b47fdb5b7-gd7kj"] Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.157781 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b47fdb5b7-gd7kj"] Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.194370 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c7fbdd79f-wl96w"] Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.214787 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c7fbdd79f-wl96w"] Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.570404 4599 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="2e0df496-5c55-4a44-89b6-5acd786c4aa2" path="/var/lib/kubelet/pods/2e0df496-5c55-4a44-89b6-5acd786c4aa2/volumes" Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.571194 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="531f1f31-7281-4e7f-af67-2592fd9733ed" path="/var/lib/kubelet/pods/531f1f31-7281-4e7f-af67-2592fd9733ed/volumes" Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.848136 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba6bf0af-7d41-4886-9222-3c46d8ba55e2","Type":"ContainerStarted","Data":"bbd6b705d355fa9f3c6818a39b0b8d62c104a44ae2b710f18d60e1be788303da"} Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.848383 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ba6bf0af-7d41-4886-9222-3c46d8ba55e2" containerName="glance-log" containerID="cri-o://77aed42684f97a3603ff2909ec9719ff2c14c136e27aa20e9688ba4a364c5739" gracePeriod=30 Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.848870 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ba6bf0af-7d41-4886-9222-3c46d8ba55e2" containerName="glance-httpd" containerID="cri-o://bbd6b705d355fa9f3c6818a39b0b8d62c104a44ae2b710f18d60e1be788303da" gracePeriod=30 Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.853604 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c477ca80-6743-4c07-b8ab-d9a083b87bcd","Type":"ContainerStarted","Data":"29f4ca0a60b4c730c3823c30d3eb113f3ee8a2400a96bbf3dd010ceec274b23b"} Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.853710 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c477ca80-6743-4c07-b8ab-d9a083b87bcd" 
containerName="glance-log" containerID="cri-o://d58fa2adf48aab55a1f6b72219075412b7688b4dbe7c1061af4723835d667f7c" gracePeriod=30 Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.853787 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c477ca80-6743-4c07-b8ab-d9a083b87bcd" containerName="glance-httpd" containerID="cri-o://29f4ca0a60b4c730c3823c30d3eb113f3ee8a2400a96bbf3dd010ceec274b23b" gracePeriod=30 Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.871489 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.871469928 podStartE2EDuration="4.871469928s" podCreationTimestamp="2025-10-12 07:49:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:49:09.868477862 +0000 UTC m=+846.657673364" watchObservedRunningTime="2025-10-12 07:49:09.871469928 +0000 UTC m=+846.660665431" Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.888646 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.888628311 podStartE2EDuration="4.888628311s" podCreationTimestamp="2025-10-12 07:49:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:49:09.886592598 +0000 UTC m=+846.675788101" watchObservedRunningTime="2025-10-12 07:49:09.888628311 +0000 UTC m=+846.677823813" Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.946217 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-156a-account-create-6tgjb"] Oct 12 07:49:09 crc kubenswrapper[4599]: E1012 07:49:09.946578 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="531f1f31-7281-4e7f-af67-2592fd9733ed" containerName="dnsmasq-dns" Oct 
12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.946597 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="531f1f31-7281-4e7f-af67-2592fd9733ed" containerName="dnsmasq-dns" Oct 12 07:49:09 crc kubenswrapper[4599]: E1012 07:49:09.946620 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e0df496-5c55-4a44-89b6-5acd786c4aa2" containerName="init" Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.946626 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e0df496-5c55-4a44-89b6-5acd786c4aa2" containerName="init" Oct 12 07:49:09 crc kubenswrapper[4599]: E1012 07:49:09.946638 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="531f1f31-7281-4e7f-af67-2592fd9733ed" containerName="init" Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.946643 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="531f1f31-7281-4e7f-af67-2592fd9733ed" containerName="init" Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.946792 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e0df496-5c55-4a44-89b6-5acd786c4aa2" containerName="init" Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.946807 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="531f1f31-7281-4e7f-af67-2592fd9733ed" containerName="dnsmasq-dns" Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.947357 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-156a-account-create-6tgjb" Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.949286 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 12 07:49:09 crc kubenswrapper[4599]: I1012 07:49:09.957198 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-156a-account-create-6tgjb"] Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.053016 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-c095-account-create-4scdb"] Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.054319 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c095-account-create-4scdb" Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.060168 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c095-account-create-4scdb"] Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.062130 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.107033 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5564g\" (UniqueName: \"kubernetes.io/projected/3775409f-2e10-405a-b777-66ce4f084bd7-kube-api-access-5564g\") pod \"cinder-156a-account-create-6tgjb\" (UID: \"3775409f-2e10-405a-b777-66ce4f084bd7\") " pod="openstack/cinder-156a-account-create-6tgjb" Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.209016 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vp2f\" (UniqueName: \"kubernetes.io/projected/ef050d1a-fc88-429d-a19a-eb55d2933057-kube-api-access-7vp2f\") pod \"barbican-c095-account-create-4scdb\" (UID: \"ef050d1a-fc88-429d-a19a-eb55d2933057\") " pod="openstack/barbican-c095-account-create-4scdb" Oct 12 07:49:10 crc kubenswrapper[4599]: 
I1012 07:49:10.209169 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5564g\" (UniqueName: \"kubernetes.io/projected/3775409f-2e10-405a-b777-66ce4f084bd7-kube-api-access-5564g\") pod \"cinder-156a-account-create-6tgjb\" (UID: \"3775409f-2e10-405a-b777-66ce4f084bd7\") " pod="openstack/cinder-156a-account-create-6tgjb" Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.232861 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5564g\" (UniqueName: \"kubernetes.io/projected/3775409f-2e10-405a-b777-66ce4f084bd7-kube-api-access-5564g\") pod \"cinder-156a-account-create-6tgjb\" (UID: \"3775409f-2e10-405a-b777-66ce4f084bd7\") " pod="openstack/cinder-156a-account-create-6tgjb" Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.311499 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vp2f\" (UniqueName: \"kubernetes.io/projected/ef050d1a-fc88-429d-a19a-eb55d2933057-kube-api-access-7vp2f\") pod \"barbican-c095-account-create-4scdb\" (UID: \"ef050d1a-fc88-429d-a19a-eb55d2933057\") " pod="openstack/barbican-c095-account-create-4scdb" Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.333285 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vp2f\" (UniqueName: \"kubernetes.io/projected/ef050d1a-fc88-429d-a19a-eb55d2933057-kube-api-access-7vp2f\") pod \"barbican-c095-account-create-4scdb\" (UID: \"ef050d1a-fc88-429d-a19a-eb55d2933057\") " pod="openstack/barbican-c095-account-create-4scdb" Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.350252 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-0e9d-account-create-xlkmf"] Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.351230 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0e9d-account-create-xlkmf" Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.355238 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.356896 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0e9d-account-create-xlkmf"] Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.466955 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-156a-account-create-6tgjb" Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.485882 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c095-account-create-4scdb" Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.514901 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqdgv\" (UniqueName: \"kubernetes.io/projected/616ae17d-da7f-4f46-9d1e-234bcb028377-kube-api-access-vqdgv\") pod \"neutron-0e9d-account-create-xlkmf\" (UID: \"616ae17d-da7f-4f46-9d1e-234bcb028377\") " pod="openstack/neutron-0e9d-account-create-xlkmf" Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.616454 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqdgv\" (UniqueName: \"kubernetes.io/projected/616ae17d-da7f-4f46-9d1e-234bcb028377-kube-api-access-vqdgv\") pod \"neutron-0e9d-account-create-xlkmf\" (UID: \"616ae17d-da7f-4f46-9d1e-234bcb028377\") " pod="openstack/neutron-0e9d-account-create-xlkmf" Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.632629 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqdgv\" (UniqueName: \"kubernetes.io/projected/616ae17d-da7f-4f46-9d1e-234bcb028377-kube-api-access-vqdgv\") pod \"neutron-0e9d-account-create-xlkmf\" (UID: \"616ae17d-da7f-4f46-9d1e-234bcb028377\") " 
pod="openstack/neutron-0e9d-account-create-xlkmf" Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.666624 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0e9d-account-create-xlkmf" Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.880272 4599 generic.go:334] "Generic (PLEG): container finished" podID="60c2daa7-082e-4c22-a040-67cbfd077bf4" containerID="e42beb0063eaf01d7c5f602a38d181a55b0b694e53c7f9a9dd627c1a22637b2d" exitCode=0 Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.880446 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zh5h5" event={"ID":"60c2daa7-082e-4c22-a040-67cbfd077bf4","Type":"ContainerDied","Data":"e42beb0063eaf01d7c5f602a38d181a55b0b694e53c7f9a9dd627c1a22637b2d"} Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.898805 4599 generic.go:334] "Generic (PLEG): container finished" podID="c477ca80-6743-4c07-b8ab-d9a083b87bcd" containerID="29f4ca0a60b4c730c3823c30d3eb113f3ee8a2400a96bbf3dd010ceec274b23b" exitCode=0 Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.898843 4599 generic.go:334] "Generic (PLEG): container finished" podID="c477ca80-6743-4c07-b8ab-d9a083b87bcd" containerID="d58fa2adf48aab55a1f6b72219075412b7688b4dbe7c1061af4723835d667f7c" exitCode=143 Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.898931 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c477ca80-6743-4c07-b8ab-d9a083b87bcd","Type":"ContainerDied","Data":"29f4ca0a60b4c730c3823c30d3eb113f3ee8a2400a96bbf3dd010ceec274b23b"} Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.898962 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c477ca80-6743-4c07-b8ab-d9a083b87bcd","Type":"ContainerDied","Data":"d58fa2adf48aab55a1f6b72219075412b7688b4dbe7c1061af4723835d667f7c"} Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 
07:49:10.902490 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-156a-account-create-6tgjb"] Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.907545 4599 generic.go:334] "Generic (PLEG): container finished" podID="ba6bf0af-7d41-4886-9222-3c46d8ba55e2" containerID="bbd6b705d355fa9f3c6818a39b0b8d62c104a44ae2b710f18d60e1be788303da" exitCode=0 Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.907589 4599 generic.go:334] "Generic (PLEG): container finished" podID="ba6bf0af-7d41-4886-9222-3c46d8ba55e2" containerID="77aed42684f97a3603ff2909ec9719ff2c14c136e27aa20e9688ba4a364c5739" exitCode=143 Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.908698 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba6bf0af-7d41-4886-9222-3c46d8ba55e2","Type":"ContainerDied","Data":"bbd6b705d355fa9f3c6818a39b0b8d62c104a44ae2b710f18d60e1be788303da"} Oct 12 07:49:10 crc kubenswrapper[4599]: I1012 07:49:10.908726 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba6bf0af-7d41-4886-9222-3c46d8ba55e2","Type":"ContainerDied","Data":"77aed42684f97a3603ff2909ec9719ff2c14c136e27aa20e9688ba4a364c5739"} Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.016675 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c095-account-create-4scdb"] Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.061455 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.159984 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0e9d-account-create-xlkmf"] Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.232460 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.233296 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-config-data\") pod \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.233367 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-scripts\") pod \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.233400 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-combined-ca-bundle\") pod \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.233418 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-internal-tls-certs\") pod \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " Oct 12 07:49:11 crc kubenswrapper[4599]: 
I1012 07:49:11.233447 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c477ca80-6743-4c07-b8ab-d9a083b87bcd-logs\") pod \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.233477 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wkr7\" (UniqueName: \"kubernetes.io/projected/c477ca80-6743-4c07-b8ab-d9a083b87bcd-kube-api-access-7wkr7\") pod \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.233498 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c477ca80-6743-4c07-b8ab-d9a083b87bcd-httpd-run\") pod \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\" (UID: \"c477ca80-6743-4c07-b8ab-d9a083b87bcd\") " Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.234069 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c477ca80-6743-4c07-b8ab-d9a083b87bcd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c477ca80-6743-4c07-b8ab-d9a083b87bcd" (UID: "c477ca80-6743-4c07-b8ab-d9a083b87bcd"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.234469 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c477ca80-6743-4c07-b8ab-d9a083b87bcd-logs" (OuterVolumeSpecName: "logs") pod "c477ca80-6743-4c07-b8ab-d9a083b87bcd" (UID: "c477ca80-6743-4c07-b8ab-d9a083b87bcd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.237881 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-scripts" (OuterVolumeSpecName: "scripts") pod "c477ca80-6743-4c07-b8ab-d9a083b87bcd" (UID: "c477ca80-6743-4c07-b8ab-d9a083b87bcd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.241521 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "c477ca80-6743-4c07-b8ab-d9a083b87bcd" (UID: "c477ca80-6743-4c07-b8ab-d9a083b87bcd"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.241920 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c477ca80-6743-4c07-b8ab-d9a083b87bcd-kube-api-access-7wkr7" (OuterVolumeSpecName: "kube-api-access-7wkr7") pod "c477ca80-6743-4c07-b8ab-d9a083b87bcd" (UID: "c477ca80-6743-4c07-b8ab-d9a083b87bcd"). InnerVolumeSpecName "kube-api-access-7wkr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.259548 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c477ca80-6743-4c07-b8ab-d9a083b87bcd" (UID: "c477ca80-6743-4c07-b8ab-d9a083b87bcd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.282183 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-config-data" (OuterVolumeSpecName: "config-data") pod "c477ca80-6743-4c07-b8ab-d9a083b87bcd" (UID: "c477ca80-6743-4c07-b8ab-d9a083b87bcd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.283488 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c477ca80-6743-4c07-b8ab-d9a083b87bcd" (UID: "c477ca80-6743-4c07-b8ab-d9a083b87bcd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.335156 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wkr7\" (UniqueName: \"kubernetes.io/projected/c477ca80-6743-4c07-b8ab-d9a083b87bcd-kube-api-access-7wkr7\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.335194 4599 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c477ca80-6743-4c07-b8ab-d9a083b87bcd-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.335223 4599 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.335233 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:11 crc 
kubenswrapper[4599]: I1012 07:49:11.335242 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.335254 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.335264 4599 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c477ca80-6743-4c07-b8ab-d9a083b87bcd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.335272 4599 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c477ca80-6743-4c07-b8ab-d9a083b87bcd-logs\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.351099 4599 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.436809 4599 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.939001 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.939003 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c477ca80-6743-4c07-b8ab-d9a083b87bcd","Type":"ContainerDied","Data":"621dcac8e6439133ed39eb3897124cbef040db79720909504948f9366308590c"} Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.940373 4599 scope.go:117] "RemoveContainer" containerID="29f4ca0a60b4c730c3823c30d3eb113f3ee8a2400a96bbf3dd010ceec274b23b" Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.940548 4599 generic.go:334] "Generic (PLEG): container finished" podID="3775409f-2e10-405a-b777-66ce4f084bd7" containerID="69696fa14b4229b999555a09f3c99d56d1ed2999a72dc2386a779e21cb70446a" exitCode=0 Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.940590 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-156a-account-create-6tgjb" event={"ID":"3775409f-2e10-405a-b777-66ce4f084bd7","Type":"ContainerDied","Data":"69696fa14b4229b999555a09f3c99d56d1ed2999a72dc2386a779e21cb70446a"} Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.940606 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-156a-account-create-6tgjb" event={"ID":"3775409f-2e10-405a-b777-66ce4f084bd7","Type":"ContainerStarted","Data":"82fca52d1e90a3707adfc7f8640431e2ad4b5f4d9ab45819863f1b73fe0b49ab"} Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.943568 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0e9d-account-create-xlkmf" event={"ID":"616ae17d-da7f-4f46-9d1e-234bcb028377","Type":"ContainerStarted","Data":"cae965b3e05b72852df4432fceb26eb9b849c4ca4dac57f2211ea9e00b732d0e"} Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.946923 4599 generic.go:334] "Generic (PLEG): container finished" podID="ef050d1a-fc88-429d-a19a-eb55d2933057" 
containerID="af771c5b9656ff27ab51a4ad7330e600b927200350e0d20ebade0d7bcffd2bc7" exitCode=0 Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.947222 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c095-account-create-4scdb" event={"ID":"ef050d1a-fc88-429d-a19a-eb55d2933057","Type":"ContainerDied","Data":"af771c5b9656ff27ab51a4ad7330e600b927200350e0d20ebade0d7bcffd2bc7"} Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.947283 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c095-account-create-4scdb" event={"ID":"ef050d1a-fc88-429d-a19a-eb55d2933057","Type":"ContainerStarted","Data":"10a8c5ff0f7349c985ba2db9c21303ab49f3252dc86fe2cf1f02fc0851ef6b8a"} Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.987489 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 07:49:11 crc kubenswrapper[4599]: I1012 07:49:11.993908 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.011501 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 07:49:12 crc kubenswrapper[4599]: E1012 07:49:12.012006 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c477ca80-6743-4c07-b8ab-d9a083b87bcd" containerName="glance-httpd" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.012025 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="c477ca80-6743-4c07-b8ab-d9a083b87bcd" containerName="glance-httpd" Oct 12 07:49:12 crc kubenswrapper[4599]: E1012 07:49:12.012062 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c477ca80-6743-4c07-b8ab-d9a083b87bcd" containerName="glance-log" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.012071 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="c477ca80-6743-4c07-b8ab-d9a083b87bcd" containerName="glance-log" 
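The entries above all share the klog header layout: a severity letter (I/E), an MMDD date fused to it, a microsecond wall-clock time, the thread id, `file.go:line`, then the quoted message. A minimal sketch of a parser for that header (the field names are my own; the regex assumes the standard klog format, not anything specific to this node):

```python
import re

# Match the klog header inside a journal line, e.g.:
#   ... kubenswrapper[4599]: I1012 07:49:12.012221 4599 memory_manager.go:354] "..."
# Groups: severity letter, MMDD date, time, thread id, source file:line, message.
KLOG_RE = re.compile(
    r'(?P<sev>[IWEF])(?P<mmdd>\d{4}) '
    r'(?P<time>\d{2}:\d{2}:\d{2}\.\d{6}) +'
    r'(?P<tid>\d+) '
    r'(?P<src>[\w.]+:\d+)\] '
    r'(?P<msg>.*)'
)

def parse_klog(line: str):
    """Return a dict of klog header fields, or None if no header is present."""
    m = KLOG_RE.search(line)
    return m.groupdict() if m else None
```

Run against any single entry from this log, it splits out the `src` field (e.g. `reconciler_common.go:218`), which is the quickest way to group these lines by subsystem.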
Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.012221 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="c477ca80-6743-4c07-b8ab-d9a083b87bcd" containerName="glance-log" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.012245 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="c477ca80-6743-4c07-b8ab-d9a083b87bcd" containerName="glance-httpd" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.013385 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.018910 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.029110 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.030914 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.148966 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc31b00c-21aa-4427-9b7a-c505c0520a2b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.149071 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc31b00c-21aa-4427-9b7a-c505c0520a2b-logs\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.149097 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.149231 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.149316 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6k49\" (UniqueName: \"kubernetes.io/projected/dc31b00c-21aa-4427-9b7a-c505c0520a2b-kube-api-access-x6k49\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.149371 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.149402 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: 
I1012 07:49:12.149805 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.251180 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc31b00c-21aa-4427-9b7a-c505c0520a2b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.251233 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc31b00c-21aa-4427-9b7a-c505c0520a2b-logs\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.251254 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.251284 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.251308 4599 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-x6k49\" (UniqueName: \"kubernetes.io/projected/dc31b00c-21aa-4427-9b7a-c505c0520a2b-kube-api-access-x6k49\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.251322 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.251350 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.251439 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.251736 4599 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.251794 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/dc31b00c-21aa-4427-9b7a-c505c0520a2b-logs\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.252093 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc31b00c-21aa-4427-9b7a-c505c0520a2b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.257347 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.257963 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.258503 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.260688 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-internal-tls-certs\") pod \"glance-default-internal-api-0\" 
(UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.268717 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6k49\" (UniqueName: \"kubernetes.io/projected/dc31b00c-21aa-4427-9b7a-c505c0520a2b-kube-api-access-x6k49\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.281409 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.344706 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.853580 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.860503 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.948718 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-zh5h5"] Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.953497 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-zh5h5"] Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.959855 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba6bf0af-7d41-4886-9222-3c46d8ba55e2","Type":"ContainerDied","Data":"8742dff11f6ccf82fa611a06f474052f8fa67fb23f37bea59fb07df94f665108"} Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.959878 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.961561 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-zh5h5" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.961627 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0d880548bb07640f6d1418cbbe02d9884e9a3998566c51b24caa1443687778e" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.966809 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxxqv\" (UniqueName: \"kubernetes.io/projected/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-kube-api-access-rxxqv\") pod \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.966853 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-combined-ca-bundle\") pod \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.966892 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-public-tls-certs\") pod \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.966916 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.966974 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-fernet-keys\") pod \"60c2daa7-082e-4c22-a040-67cbfd077bf4\" (UID: 
\"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.966996 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-combined-ca-bundle\") pod \"60c2daa7-082e-4c22-a040-67cbfd077bf4\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.967052 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-credential-keys\") pod \"60c2daa7-082e-4c22-a040-67cbfd077bf4\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.967094 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-config-data\") pod \"60c2daa7-082e-4c22-a040-67cbfd077bf4\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.967128 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-logs\") pod \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.967156 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-scripts\") pod \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.967186 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhklk\" (UniqueName: 
\"kubernetes.io/projected/60c2daa7-082e-4c22-a040-67cbfd077bf4-kube-api-access-zhklk\") pod \"60c2daa7-082e-4c22-a040-67cbfd077bf4\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.967207 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-config-data\") pod \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.967231 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-scripts\") pod \"60c2daa7-082e-4c22-a040-67cbfd077bf4\" (UID: \"60c2daa7-082e-4c22-a040-67cbfd077bf4\") " Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.967268 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-httpd-run\") pod \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\" (UID: \"ba6bf0af-7d41-4886-9222-3c46d8ba55e2\") " Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.967866 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-logs" (OuterVolumeSpecName: "logs") pod "ba6bf0af-7d41-4886-9222-3c46d8ba55e2" (UID: "ba6bf0af-7d41-4886-9222-3c46d8ba55e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.971629 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ba6bf0af-7d41-4886-9222-3c46d8ba55e2" (UID: "ba6bf0af-7d41-4886-9222-3c46d8ba55e2"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.972190 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-scripts" (OuterVolumeSpecName: "scripts") pod "60c2daa7-082e-4c22-a040-67cbfd077bf4" (UID: "60c2daa7-082e-4c22-a040-67cbfd077bf4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.972449 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60c2daa7-082e-4c22-a040-67cbfd077bf4-kube-api-access-zhklk" (OuterVolumeSpecName: "kube-api-access-zhklk") pod "60c2daa7-082e-4c22-a040-67cbfd077bf4" (UID: "60c2daa7-082e-4c22-a040-67cbfd077bf4"). InnerVolumeSpecName "kube-api-access-zhklk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.974459 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "60c2daa7-082e-4c22-a040-67cbfd077bf4" (UID: "60c2daa7-082e-4c22-a040-67cbfd077bf4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.974721 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "60c2daa7-082e-4c22-a040-67cbfd077bf4" (UID: "60c2daa7-082e-4c22-a040-67cbfd077bf4"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.975513 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "ba6bf0af-7d41-4886-9222-3c46d8ba55e2" (UID: "ba6bf0af-7d41-4886-9222-3c46d8ba55e2"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.979078 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-scripts" (OuterVolumeSpecName: "scripts") pod "ba6bf0af-7d41-4886-9222-3c46d8ba55e2" (UID: "ba6bf0af-7d41-4886-9222-3c46d8ba55e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:12 crc kubenswrapper[4599]: I1012 07:49:12.988267 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-kube-api-access-rxxqv" (OuterVolumeSpecName: "kube-api-access-rxxqv") pod "ba6bf0af-7d41-4886-9222-3c46d8ba55e2" (UID: "ba6bf0af-7d41-4886-9222-3c46d8ba55e2"). InnerVolumeSpecName "kube-api-access-rxxqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.000436 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-config-data" (OuterVolumeSpecName: "config-data") pod "60c2daa7-082e-4c22-a040-67cbfd077bf4" (UID: "60c2daa7-082e-4c22-a040-67cbfd077bf4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.001627 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba6bf0af-7d41-4886-9222-3c46d8ba55e2" (UID: "ba6bf0af-7d41-4886-9222-3c46d8ba55e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.003601 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60c2daa7-082e-4c22-a040-67cbfd077bf4" (UID: "60c2daa7-082e-4c22-a040-67cbfd077bf4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.015995 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ba6bf0af-7d41-4886-9222-3c46d8ba55e2" (UID: "ba6bf0af-7d41-4886-9222-3c46d8ba55e2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.016768 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-config-data" (OuterVolumeSpecName: "config-data") pod "ba6bf0af-7d41-4886-9222-3c46d8ba55e2" (UID: "ba6bf0af-7d41-4886-9222-3c46d8ba55e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.058368 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-tsv76"] Oct 12 07:49:13 crc kubenswrapper[4599]: E1012 07:49:13.058991 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba6bf0af-7d41-4886-9222-3c46d8ba55e2" containerName="glance-log" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.059009 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba6bf0af-7d41-4886-9222-3c46d8ba55e2" containerName="glance-log" Oct 12 07:49:13 crc kubenswrapper[4599]: E1012 07:49:13.059043 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba6bf0af-7d41-4886-9222-3c46d8ba55e2" containerName="glance-httpd" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.059049 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba6bf0af-7d41-4886-9222-3c46d8ba55e2" containerName="glance-httpd" Oct 12 07:49:13 crc kubenswrapper[4599]: E1012 07:49:13.059071 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60c2daa7-082e-4c22-a040-67cbfd077bf4" containerName="keystone-bootstrap" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.059077 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="60c2daa7-082e-4c22-a040-67cbfd077bf4" containerName="keystone-bootstrap" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.059221 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="60c2daa7-082e-4c22-a040-67cbfd077bf4" containerName="keystone-bootstrap" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.059234 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba6bf0af-7d41-4886-9222-3c46d8ba55e2" containerName="glance-httpd" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.059252 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba6bf0af-7d41-4886-9222-3c46d8ba55e2" containerName="glance-log" Oct 
12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.059865 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.070609 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-config-data\") pod \"keystone-bootstrap-tsv76\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.070678 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-scripts\") pod \"keystone-bootstrap-tsv76\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.070772 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpx8x\" (UniqueName: \"kubernetes.io/projected/fe135e30-f182-4517-8605-1097e7391663-kube-api-access-wpx8x\") pod \"keystone-bootstrap-tsv76\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.070813 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-credential-keys\") pod \"keystone-bootstrap-tsv76\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.070843 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-fernet-keys\") pod \"keystone-bootstrap-tsv76\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.070861 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-combined-ca-bundle\") pod \"keystone-bootstrap-tsv76\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.070936 4599 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.070953 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxxqv\" (UniqueName: \"kubernetes.io/projected/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-kube-api-access-rxxqv\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.070963 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.070971 4599 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.070991 4599 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 
07:49:13.070999 4599 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.071007 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.071017 4599 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.071026 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.071035 4599 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-logs\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.071043 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.071051 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhklk\" (UniqueName: \"kubernetes.io/projected/60c2daa7-082e-4c22-a040-67cbfd077bf4-kube-api-access-zhklk\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.071069 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ba6bf0af-7d41-4886-9222-3c46d8ba55e2-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.071077 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60c2daa7-082e-4c22-a040-67cbfd077bf4-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.077875 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tsv76"] Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.086791 4599 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.172413 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpx8x\" (UniqueName: \"kubernetes.io/projected/fe135e30-f182-4517-8605-1097e7391663-kube-api-access-wpx8x\") pod \"keystone-bootstrap-tsv76\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.172486 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-credential-keys\") pod \"keystone-bootstrap-tsv76\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.172531 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-fernet-keys\") pod \"keystone-bootstrap-tsv76\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.172549 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-combined-ca-bundle\") pod \"keystone-bootstrap-tsv76\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.172632 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-config-data\") pod \"keystone-bootstrap-tsv76\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.172681 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-scripts\") pod \"keystone-bootstrap-tsv76\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.172761 4599 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.176355 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-scripts\") pod \"keystone-bootstrap-tsv76\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.177093 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-fernet-keys\") pod \"keystone-bootstrap-tsv76\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " pod="openstack/keystone-bootstrap-tsv76" Oct 12 
07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.177895 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-combined-ca-bundle\") pod \"keystone-bootstrap-tsv76\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.179278 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-config-data\") pod \"keystone-bootstrap-tsv76\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.182431 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-credential-keys\") pod \"keystone-bootstrap-tsv76\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.186407 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpx8x\" (UniqueName: \"kubernetes.io/projected/fe135e30-f182-4517-8605-1097e7391663-kube-api-access-wpx8x\") pod \"keystone-bootstrap-tsv76\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.312537 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.326589 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.342401 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 
07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.357804 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.358351 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.360686 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.360703 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.377564 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:13 crc kubenswrapper[4599]: E1012 07:49:13.387209 4599 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba6bf0af_7d41_4886_9222_3c46d8ba55e2.slice/crio-8742dff11f6ccf82fa611a06f474052f8fa67fb23f37bea59fb07df94f665108\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60c2daa7_082e_4c22_a040_67cbfd077bf4.slice/crio-e0d880548bb07640f6d1418cbbe02d9884e9a3998566c51b24caa1443687778e\": RecentStats: unable to find data in memory cache]" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.402578 4599 scope.go:117] "RemoveContainer" containerID="d58fa2adf48aab55a1f6b72219075412b7688b4dbe7c1061af4723835d667f7c" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.418104 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-156a-account-create-6tgjb" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.427272 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c095-account-create-4scdb" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.479869 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.479918 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-config-data\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.479943 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88d1ee0b-4340-44c2-b357-1eb1741e30ca-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.480144 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88d1ee0b-4340-44c2-b357-1eb1741e30ca-logs\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.480267 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76449\" (UniqueName: \"kubernetes.io/projected/88d1ee0b-4340-44c2-b357-1eb1741e30ca-kube-api-access-76449\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.480307 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.480441 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-scripts\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.480538 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.559247 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60c2daa7-082e-4c22-a040-67cbfd077bf4" path="/var/lib/kubelet/pods/60c2daa7-082e-4c22-a040-67cbfd077bf4/volumes" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.560293 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba6bf0af-7d41-4886-9222-3c46d8ba55e2" 
path="/var/lib/kubelet/pods/ba6bf0af-7d41-4886-9222-3c46d8ba55e2/volumes" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.560951 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c477ca80-6743-4c07-b8ab-d9a083b87bcd" path="/var/lib/kubelet/pods/c477ca80-6743-4c07-b8ab-d9a083b87bcd/volumes" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.582402 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vp2f\" (UniqueName: \"kubernetes.io/projected/ef050d1a-fc88-429d-a19a-eb55d2933057-kube-api-access-7vp2f\") pod \"ef050d1a-fc88-429d-a19a-eb55d2933057\" (UID: \"ef050d1a-fc88-429d-a19a-eb55d2933057\") " Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.582791 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5564g\" (UniqueName: \"kubernetes.io/projected/3775409f-2e10-405a-b777-66ce4f084bd7-kube-api-access-5564g\") pod \"3775409f-2e10-405a-b777-66ce4f084bd7\" (UID: \"3775409f-2e10-405a-b777-66ce4f084bd7\") " Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.583434 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.583485 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.583546 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-config-data\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.583567 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88d1ee0b-4340-44c2-b357-1eb1741e30ca-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.583636 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88d1ee0b-4340-44c2-b357-1eb1741e30ca-logs\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.583697 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76449\" (UniqueName: \"kubernetes.io/projected/88d1ee0b-4340-44c2-b357-1eb1741e30ca-kube-api-access-76449\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.583724 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.583783 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-scripts\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.585573 4599 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.585579 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88d1ee0b-4340-44c2-b357-1eb1741e30ca-logs\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.589503 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88d1ee0b-4340-44c2-b357-1eb1741e30ca-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.592185 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef050d1a-fc88-429d-a19a-eb55d2933057-kube-api-access-7vp2f" (OuterVolumeSpecName: "kube-api-access-7vp2f") pod "ef050d1a-fc88-429d-a19a-eb55d2933057" (UID: "ef050d1a-fc88-429d-a19a-eb55d2933057"). InnerVolumeSpecName "kube-api-access-7vp2f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.592528 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.593879 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-scripts\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.593892 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.594192 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-config-data\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.600959 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3775409f-2e10-405a-b777-66ce4f084bd7-kube-api-access-5564g" (OuterVolumeSpecName: "kube-api-access-5564g") pod "3775409f-2e10-405a-b777-66ce4f084bd7" (UID: "3775409f-2e10-405a-b777-66ce4f084bd7"). InnerVolumeSpecName "kube-api-access-5564g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.605699 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76449\" (UniqueName: \"kubernetes.io/projected/88d1ee0b-4340-44c2-b357-1eb1741e30ca-kube-api-access-76449\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.615739 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.690411 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vp2f\" (UniqueName: \"kubernetes.io/projected/ef050d1a-fc88-429d-a19a-eb55d2933057-kube-api-access-7vp2f\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.690451 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5564g\" (UniqueName: \"kubernetes.io/projected/3775409f-2e10-405a-b777-66ce4f084bd7-kube-api-access-5564g\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.715656 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.977271 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c095-account-create-4scdb" event={"ID":"ef050d1a-fc88-429d-a19a-eb55d2933057","Type":"ContainerDied","Data":"10a8c5ff0f7349c985ba2db9c21303ab49f3252dc86fe2cf1f02fc0851ef6b8a"} Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.977320 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10a8c5ff0f7349c985ba2db9c21303ab49f3252dc86fe2cf1f02fc0851ef6b8a" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.977457 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c095-account-create-4scdb" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.985723 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-156a-account-create-6tgjb" event={"ID":"3775409f-2e10-405a-b777-66ce4f084bd7","Type":"ContainerDied","Data":"82fca52d1e90a3707adfc7f8640431e2ad4b5f4d9ab45819863f1b73fe0b49ab"} Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.985785 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82fca52d1e90a3707adfc7f8640431e2ad4b5f4d9ab45819863f1b73fe0b49ab" Oct 12 07:49:13 crc kubenswrapper[4599]: I1012 07:49:13.985848 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-156a-account-create-6tgjb" Oct 12 07:49:14 crc kubenswrapper[4599]: I1012 07:49:14.906569 4599 scope.go:117] "RemoveContainer" containerID="bbd6b705d355fa9f3c6818a39b0b8d62c104a44ae2b710f18d60e1be788303da" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.030342 4599 scope.go:117] "RemoveContainer" containerID="77aed42684f97a3603ff2909ec9719ff2c14c136e27aa20e9688ba4a364c5739" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.171682 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-x47gg"] Oct 12 07:49:15 crc kubenswrapper[4599]: E1012 07:49:15.172218 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef050d1a-fc88-429d-a19a-eb55d2933057" containerName="mariadb-account-create" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.172283 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef050d1a-fc88-429d-a19a-eb55d2933057" containerName="mariadb-account-create" Oct 12 07:49:15 crc kubenswrapper[4599]: E1012 07:49:15.172363 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3775409f-2e10-405a-b777-66ce4f084bd7" containerName="mariadb-account-create" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.172412 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="3775409f-2e10-405a-b777-66ce4f084bd7" containerName="mariadb-account-create" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.172599 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef050d1a-fc88-429d-a19a-eb55d2933057" containerName="mariadb-account-create" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.172670 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="3775409f-2e10-405a-b777-66ce4f084bd7" containerName="mariadb-account-create" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.173531 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.176517 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nchcj" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.177664 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.177866 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.185355 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-x47gg"] Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.238986 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-config-data\") pod \"cinder-db-sync-x47gg\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.239040 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-scripts\") pod \"cinder-db-sync-x47gg\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.239081 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d7cz\" (UniqueName: \"kubernetes.io/projected/dc0cde93-5cbb-4652-9dc3-05666e908b49-kube-api-access-4d7cz\") pod \"cinder-db-sync-x47gg\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.239125 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc0cde93-5cbb-4652-9dc3-05666e908b49-etc-machine-id\") pod \"cinder-db-sync-x47gg\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.239145 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-db-sync-config-data\") pod \"cinder-db-sync-x47gg\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.239186 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-combined-ca-bundle\") pod \"cinder-db-sync-x47gg\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.340189 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-combined-ca-bundle\") pod \"cinder-db-sync-x47gg\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.340263 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-config-data\") pod \"cinder-db-sync-x47gg\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.340311 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-scripts\") pod \"cinder-db-sync-x47gg\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.340351 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d7cz\" (UniqueName: \"kubernetes.io/projected/dc0cde93-5cbb-4652-9dc3-05666e908b49-kube-api-access-4d7cz\") pod \"cinder-db-sync-x47gg\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.340403 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc0cde93-5cbb-4652-9dc3-05666e908b49-etc-machine-id\") pod \"cinder-db-sync-x47gg\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.340433 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-db-sync-config-data\") pod \"cinder-db-sync-x47gg\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.340917 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc0cde93-5cbb-4652-9dc3-05666e908b49-etc-machine-id\") pod \"cinder-db-sync-x47gg\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.348441 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-config-data\") pod \"cinder-db-sync-x47gg\" (UID: 
\"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.348812 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-scripts\") pod \"cinder-db-sync-x47gg\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.351757 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-db-sync-config-data\") pod \"cinder-db-sync-x47gg\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.355510 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-combined-ca-bundle\") pod \"cinder-db-sync-x47gg\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.369669 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.380680 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-h6mzn"] Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.385442 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-h6mzn" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.388998 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-96cb4" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.389317 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.390959 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d7cz\" (UniqueName: \"kubernetes.io/projected/dc0cde93-5cbb-4652-9dc3-05666e908b49-kube-api-access-4d7cz\") pod \"cinder-db-sync-x47gg\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.418743 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-h6mzn"] Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.436393 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tsv76"] Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.444977 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8468e667-fc53-4e39-939f-2221c78d4313-combined-ca-bundle\") pod \"barbican-db-sync-h6mzn\" (UID: \"8468e667-fc53-4e39-939f-2221c78d4313\") " pod="openstack/barbican-db-sync-h6mzn" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.445047 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrttr\" (UniqueName: \"kubernetes.io/projected/8468e667-fc53-4e39-939f-2221c78d4313-kube-api-access-vrttr\") pod \"barbican-db-sync-h6mzn\" (UID: \"8468e667-fc53-4e39-939f-2221c78d4313\") " pod="openstack/barbican-db-sync-h6mzn" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.445154 4599 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8468e667-fc53-4e39-939f-2221c78d4313-db-sync-config-data\") pod \"barbican-db-sync-h6mzn\" (UID: \"8468e667-fc53-4e39-939f-2221c78d4313\") " pod="openstack/barbican-db-sync-h6mzn" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.495706 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.506527 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.546744 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8468e667-fc53-4e39-939f-2221c78d4313-combined-ca-bundle\") pod \"barbican-db-sync-h6mzn\" (UID: \"8468e667-fc53-4e39-939f-2221c78d4313\") " pod="openstack/barbican-db-sync-h6mzn" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.546806 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrttr\" (UniqueName: \"kubernetes.io/projected/8468e667-fc53-4e39-939f-2221c78d4313-kube-api-access-vrttr\") pod \"barbican-db-sync-h6mzn\" (UID: \"8468e667-fc53-4e39-939f-2221c78d4313\") " pod="openstack/barbican-db-sync-h6mzn" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.546838 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8468e667-fc53-4e39-939f-2221c78d4313-db-sync-config-data\") pod \"barbican-db-sync-h6mzn\" (UID: \"8468e667-fc53-4e39-939f-2221c78d4313\") " pod="openstack/barbican-db-sync-h6mzn" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.550648 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8468e667-fc53-4e39-939f-2221c78d4313-combined-ca-bundle\") pod \"barbican-db-sync-h6mzn\" (UID: \"8468e667-fc53-4e39-939f-2221c78d4313\") " pod="openstack/barbican-db-sync-h6mzn" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.551591 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8468e667-fc53-4e39-939f-2221c78d4313-db-sync-config-data\") pod \"barbican-db-sync-h6mzn\" (UID: \"8468e667-fc53-4e39-939f-2221c78d4313\") " pod="openstack/barbican-db-sync-h6mzn" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.562639 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrttr\" (UniqueName: \"kubernetes.io/projected/8468e667-fc53-4e39-939f-2221c78d4313-kube-api-access-vrttr\") pod \"barbican-db-sync-h6mzn\" (UID: \"8468e667-fc53-4e39-939f-2221c78d4313\") " pod="openstack/barbican-db-sync-h6mzn" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.762245 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-h6mzn" Oct 12 07:49:15 crc kubenswrapper[4599]: I1012 07:49:15.947423 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-x47gg"] Oct 12 07:49:15 crc kubenswrapper[4599]: W1012 07:49:15.951824 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc0cde93_5cbb_4652_9dc3_05666e908b49.slice/crio-e3cc40fdec404afbb24efe8268d6fb4ae39de440a6f89184660b1c77f3be41d2 WatchSource:0}: Error finding container e3cc40fdec404afbb24efe8268d6fb4ae39de440a6f89184660b1c77f3be41d2: Status 404 returned error can't find the container with id e3cc40fdec404afbb24efe8268d6fb4ae39de440a6f89184660b1c77f3be41d2 Oct 12 07:49:16 crc kubenswrapper[4599]: I1012 07:49:16.031769 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dc31b00c-21aa-4427-9b7a-c505c0520a2b","Type":"ContainerStarted","Data":"159b7718922d0ed3a94be25af6599182de8723adcf8b49b4193ff782af3811cd"} Oct 12 07:49:16 crc kubenswrapper[4599]: I1012 07:49:16.033579 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x47gg" event={"ID":"dc0cde93-5cbb-4652-9dc3-05666e908b49","Type":"ContainerStarted","Data":"e3cc40fdec404afbb24efe8268d6fb4ae39de440a6f89184660b1c77f3be41d2"} Oct 12 07:49:16 crc kubenswrapper[4599]: I1012 07:49:16.034669 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"88d1ee0b-4340-44c2-b357-1eb1741e30ca","Type":"ContainerStarted","Data":"4a806162947aca9c71f7c24295e69d42ef5b1ff04a34e49bf337c8f4cdc1d598"} Oct 12 07:49:16 crc kubenswrapper[4599]: I1012 07:49:16.036211 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tsv76" 
event={"ID":"fe135e30-f182-4517-8605-1097e7391663","Type":"ContainerStarted","Data":"04f82369cc5cd945d983ea3d11c618039f7d944f00bfb98dab9a030da4affe36"} Oct 12 07:49:16 crc kubenswrapper[4599]: I1012 07:49:16.154181 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-h6mzn"] Oct 12 07:49:16 crc kubenswrapper[4599]: W1012 07:49:16.157275 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8468e667_fc53_4e39_939f_2221c78d4313.slice/crio-6fa4c11bf7f8c99e9df06afc0918bd58d2acc2bf3c6bf895e33bdb67868ee9ab WatchSource:0}: Error finding container 6fa4c11bf7f8c99e9df06afc0918bd58d2acc2bf3c6bf895e33bdb67868ee9ab: Status 404 returned error can't find the container with id 6fa4c11bf7f8c99e9df06afc0918bd58d2acc2bf3c6bf895e33bdb67868ee9ab Oct 12 07:49:16 crc kubenswrapper[4599]: I1012 07:49:16.614782 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:16 crc kubenswrapper[4599]: I1012 07:49:16.660528 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dcf8755f-m2r8l"] Oct 12 07:49:16 crc kubenswrapper[4599]: I1012 07:49:16.660743 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" podUID="d0568da7-e8d7-4506-b405-8f7488ce28f9" containerName="dnsmasq-dns" containerID="cri-o://ad652b3ca8bdcc1691f62d028ce31f4973911eb7043f562c6184fa5e4c28e41e" gracePeriod=10 Oct 12 07:49:17 crc kubenswrapper[4599]: I1012 07:49:17.043670 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-h6mzn" event={"ID":"8468e667-fc53-4e39-939f-2221c78d4313","Type":"ContainerStarted","Data":"6fa4c11bf7f8c99e9df06afc0918bd58d2acc2bf3c6bf895e33bdb67868ee9ab"} Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.058625 4599 generic.go:334] "Generic (PLEG): container finished" 
podID="d0568da7-e8d7-4506-b405-8f7488ce28f9" containerID="ad652b3ca8bdcc1691f62d028ce31f4973911eb7043f562c6184fa5e4c28e41e" exitCode=0 Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.058901 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" event={"ID":"d0568da7-e8d7-4506-b405-8f7488ce28f9","Type":"ContainerDied","Data":"ad652b3ca8bdcc1691f62d028ce31f4973911eb7043f562c6184fa5e4c28e41e"} Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.063692 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f94bf31d-933c-4cca-998a-4c3fc9a451f4","Type":"ContainerStarted","Data":"7f1c45d5e3d2dfbf584a7cb0627a5bc4ccaff62de2ca235d6cde978919ccda81"} Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.065363 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pkz8c" event={"ID":"9d43ed11-7175-498a-8dc1-e15cfc41b5c8","Type":"ContainerStarted","Data":"5a4ad9a8d063c11b64ee6e2b27292cec66ec28f8bd6209138160a1fec33b2c1d"} Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.076996 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dc31b00c-21aa-4427-9b7a-c505c0520a2b","Type":"ContainerStarted","Data":"f912827e300eaa1554f6da57f1cfb2f16457c99ae40ae89ae4358b683feb7234"} Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.077030 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dc31b00c-21aa-4427-9b7a-c505c0520a2b","Type":"ContainerStarted","Data":"66e2798ff6540355b4f8a0476df8a51f5593324c7dc6dcff58b90c70631d878f"} Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.082296 4599 generic.go:334] "Generic (PLEG): container finished" podID="616ae17d-da7f-4f46-9d1e-234bcb028377" containerID="47a28c2366261edb36aec145141f5c0e6ab03e13b3b034083d35349f45c1deeb" exitCode=0 Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 
07:49:18.082511 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0e9d-account-create-xlkmf" event={"ID":"616ae17d-da7f-4f46-9d1e-234bcb028377","Type":"ContainerDied","Data":"47a28c2366261edb36aec145141f5c0e6ab03e13b3b034083d35349f45c1deeb"} Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.083216 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-pkz8c" podStartSLOduration=4.268964297 podStartE2EDuration="12.083204008s" podCreationTimestamp="2025-10-12 07:49:06 +0000 UTC" firstStartedPulling="2025-10-12 07:49:07.109931506 +0000 UTC m=+843.899127007" lastFinishedPulling="2025-10-12 07:49:14.924171217 +0000 UTC m=+851.713366718" observedRunningTime="2025-10-12 07:49:18.081677948 +0000 UTC m=+854.870873450" watchObservedRunningTime="2025-10-12 07:49:18.083204008 +0000 UTC m=+854.872399511" Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.100579 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"88d1ee0b-4340-44c2-b357-1eb1741e30ca","Type":"ContainerStarted","Data":"c1e366286f4e9ddc378f0672a98684e65e1219d248fbd57600ff6fceda569cec"} Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.100621 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"88d1ee0b-4340-44c2-b357-1eb1741e30ca","Type":"ContainerStarted","Data":"69ac54b513822bd5d17bba9e6347a4aa4221ed49830122d6174fde84876ecdd4"} Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.102378 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.102366042 podStartE2EDuration="7.102366042s" podCreationTimestamp="2025-10-12 07:49:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:49:18.101903067 +0000 UTC 
m=+854.891098570" watchObservedRunningTime="2025-10-12 07:49:18.102366042 +0000 UTC m=+854.891561544" Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.107554 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tsv76" event={"ID":"fe135e30-f182-4517-8605-1097e7391663","Type":"ContainerStarted","Data":"dec604968bf1207c013ff70bfd46a24b591d3d267a5174aeffe1f5dd2b74886c"} Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.122928 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.122915213 podStartE2EDuration="5.122915213s" podCreationTimestamp="2025-10-12 07:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:49:18.121220424 +0000 UTC m=+854.910415927" watchObservedRunningTime="2025-10-12 07:49:18.122915213 +0000 UTC m=+854.912110715" Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.151003 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-tsv76" podStartSLOduration=5.150992281 podStartE2EDuration="5.150992281s" podCreationTimestamp="2025-10-12 07:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:49:18.148746052 +0000 UTC m=+854.937941574" watchObservedRunningTime="2025-10-12 07:49:18.150992281 +0000 UTC m=+854.940187783" Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.265900 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.413519 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flhsf\" (UniqueName: \"kubernetes.io/projected/d0568da7-e8d7-4506-b405-8f7488ce28f9-kube-api-access-flhsf\") pod \"d0568da7-e8d7-4506-b405-8f7488ce28f9\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.413671 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-config\") pod \"d0568da7-e8d7-4506-b405-8f7488ce28f9\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.413699 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-dns-svc\") pod \"d0568da7-e8d7-4506-b405-8f7488ce28f9\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.413823 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-ovsdbserver-nb\") pod \"d0568da7-e8d7-4506-b405-8f7488ce28f9\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.413869 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-ovsdbserver-sb\") pod \"d0568da7-e8d7-4506-b405-8f7488ce28f9\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.414427 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-dns-swift-storage-0\") pod \"d0568da7-e8d7-4506-b405-8f7488ce28f9\" (UID: \"d0568da7-e8d7-4506-b405-8f7488ce28f9\") " Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.421489 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0568da7-e8d7-4506-b405-8f7488ce28f9-kube-api-access-flhsf" (OuterVolumeSpecName: "kube-api-access-flhsf") pod "d0568da7-e8d7-4506-b405-8f7488ce28f9" (UID: "d0568da7-e8d7-4506-b405-8f7488ce28f9"). InnerVolumeSpecName "kube-api-access-flhsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.458916 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d0568da7-e8d7-4506-b405-8f7488ce28f9" (UID: "d0568da7-e8d7-4506-b405-8f7488ce28f9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.469297 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d0568da7-e8d7-4506-b405-8f7488ce28f9" (UID: "d0568da7-e8d7-4506-b405-8f7488ce28f9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.470610 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d0568da7-e8d7-4506-b405-8f7488ce28f9" (UID: "d0568da7-e8d7-4506-b405-8f7488ce28f9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.476025 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-config" (OuterVolumeSpecName: "config") pod "d0568da7-e8d7-4506-b405-8f7488ce28f9" (UID: "d0568da7-e8d7-4506-b405-8f7488ce28f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.478869 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d0568da7-e8d7-4506-b405-8f7488ce28f9" (UID: "d0568da7-e8d7-4506-b405-8f7488ce28f9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.516973 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.517010 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.517020 4599 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.517030 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flhsf\" (UniqueName: \"kubernetes.io/projected/d0568da7-e8d7-4506-b405-8f7488ce28f9-kube-api-access-flhsf\") on 
node \"crc\" DevicePath \"\"" Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.517045 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:18 crc kubenswrapper[4599]: I1012 07:49:18.517054 4599 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0568da7-e8d7-4506-b405-8f7488ce28f9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:19 crc kubenswrapper[4599]: I1012 07:49:19.120179 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" event={"ID":"d0568da7-e8d7-4506-b405-8f7488ce28f9","Type":"ContainerDied","Data":"7c5721846c6c7c8df2b5bf313217b7a558c8304aab178d40e64c380422f42713"} Oct 12 07:49:19 crc kubenswrapper[4599]: I1012 07:49:19.120203 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dcf8755f-m2r8l" Oct 12 07:49:19 crc kubenswrapper[4599]: I1012 07:49:19.120253 4599 scope.go:117] "RemoveContainer" containerID="ad652b3ca8bdcc1691f62d028ce31f4973911eb7043f562c6184fa5e4c28e41e" Oct 12 07:49:19 crc kubenswrapper[4599]: I1012 07:49:19.121879 4599 generic.go:334] "Generic (PLEG): container finished" podID="9d43ed11-7175-498a-8dc1-e15cfc41b5c8" containerID="5a4ad9a8d063c11b64ee6e2b27292cec66ec28f8bd6209138160a1fec33b2c1d" exitCode=0 Oct 12 07:49:19 crc kubenswrapper[4599]: I1012 07:49:19.122248 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pkz8c" event={"ID":"9d43ed11-7175-498a-8dc1-e15cfc41b5c8","Type":"ContainerDied","Data":"5a4ad9a8d063c11b64ee6e2b27292cec66ec28f8bd6209138160a1fec33b2c1d"} Oct 12 07:49:19 crc kubenswrapper[4599]: I1012 07:49:19.163457 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dcf8755f-m2r8l"] Oct 12 07:49:19 crc kubenswrapper[4599]: I1012 
07:49:19.168658 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dcf8755f-m2r8l"] Oct 12 07:49:19 crc kubenswrapper[4599]: I1012 07:49:19.555154 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0568da7-e8d7-4506-b405-8f7488ce28f9" path="/var/lib/kubelet/pods/d0568da7-e8d7-4506-b405-8f7488ce28f9/volumes" Oct 12 07:49:20 crc kubenswrapper[4599]: I1012 07:49:20.131896 4599 generic.go:334] "Generic (PLEG): container finished" podID="fe135e30-f182-4517-8605-1097e7391663" containerID="dec604968bf1207c013ff70bfd46a24b591d3d267a5174aeffe1f5dd2b74886c" exitCode=0 Oct 12 07:49:20 crc kubenswrapper[4599]: I1012 07:49:20.131994 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tsv76" event={"ID":"fe135e30-f182-4517-8605-1097e7391663","Type":"ContainerDied","Data":"dec604968bf1207c013ff70bfd46a24b591d3d267a5174aeffe1f5dd2b74886c"} Oct 12 07:49:20 crc kubenswrapper[4599]: I1012 07:49:20.743440 4599 scope.go:117] "RemoveContainer" containerID="c6a64ed85521f30ad8a4e0e35efdf36c47d2fc61e6549e3e9f947967949f0233" Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.345231 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.346565 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.373390 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.381717 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.816374 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0e9d-account-create-xlkmf" Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.823503 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-pkz8c" Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.828047 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.901355 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwk57\" (UniqueName: \"kubernetes.io/projected/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-kube-api-access-lwk57\") pod \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\" (UID: \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\") " Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.901406 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-combined-ca-bundle\") pod \"fe135e30-f182-4517-8605-1097e7391663\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.901521 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-scripts\") pod \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\" (UID: \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\") " Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.901593 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqdgv\" (UniqueName: \"kubernetes.io/projected/616ae17d-da7f-4f46-9d1e-234bcb028377-kube-api-access-vqdgv\") pod \"616ae17d-da7f-4f46-9d1e-234bcb028377\" (UID: \"616ae17d-da7f-4f46-9d1e-234bcb028377\") " Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.901611 4599 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-credential-keys\") pod \"fe135e30-f182-4517-8605-1097e7391663\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.901724 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-fernet-keys\") pod \"fe135e30-f182-4517-8605-1097e7391663\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.901777 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-scripts\") pod \"fe135e30-f182-4517-8605-1097e7391663\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.901803 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpx8x\" (UniqueName: \"kubernetes.io/projected/fe135e30-f182-4517-8605-1097e7391663-kube-api-access-wpx8x\") pod \"fe135e30-f182-4517-8605-1097e7391663\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.901865 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-logs\") pod \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\" (UID: \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\") " Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.901921 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-config-data\") pod \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\" (UID: \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\") " Oct 12 
07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.901944 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-config-data\") pod \"fe135e30-f182-4517-8605-1097e7391663\" (UID: \"fe135e30-f182-4517-8605-1097e7391663\") " Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.902011 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-combined-ca-bundle\") pod \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\" (UID: \"9d43ed11-7175-498a-8dc1-e15cfc41b5c8\") " Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.902504 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-logs" (OuterVolumeSpecName: "logs") pod "9d43ed11-7175-498a-8dc1-e15cfc41b5c8" (UID: "9d43ed11-7175-498a-8dc1-e15cfc41b5c8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.902824 4599 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-logs\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.906852 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-scripts" (OuterVolumeSpecName: "scripts") pod "9d43ed11-7175-498a-8dc1-e15cfc41b5c8" (UID: "9d43ed11-7175-498a-8dc1-e15cfc41b5c8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.907987 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe135e30-f182-4517-8605-1097e7391663-kube-api-access-wpx8x" (OuterVolumeSpecName: "kube-api-access-wpx8x") pod "fe135e30-f182-4517-8605-1097e7391663" (UID: "fe135e30-f182-4517-8605-1097e7391663"). InnerVolumeSpecName "kube-api-access-wpx8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.908687 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fe135e30-f182-4517-8605-1097e7391663" (UID: "fe135e30-f182-4517-8605-1097e7391663"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.908991 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/616ae17d-da7f-4f46-9d1e-234bcb028377-kube-api-access-vqdgv" (OuterVolumeSpecName: "kube-api-access-vqdgv") pod "616ae17d-da7f-4f46-9d1e-234bcb028377" (UID: "616ae17d-da7f-4f46-9d1e-234bcb028377"). InnerVolumeSpecName "kube-api-access-vqdgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.911065 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-kube-api-access-lwk57" (OuterVolumeSpecName: "kube-api-access-lwk57") pod "9d43ed11-7175-498a-8dc1-e15cfc41b5c8" (UID: "9d43ed11-7175-498a-8dc1-e15cfc41b5c8"). InnerVolumeSpecName "kube-api-access-lwk57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.911104 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-scripts" (OuterVolumeSpecName: "scripts") pod "fe135e30-f182-4517-8605-1097e7391663" (UID: "fe135e30-f182-4517-8605-1097e7391663"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.915251 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fe135e30-f182-4517-8605-1097e7391663" (UID: "fe135e30-f182-4517-8605-1097e7391663"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.927413 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d43ed11-7175-498a-8dc1-e15cfc41b5c8" (UID: "9d43ed11-7175-498a-8dc1-e15cfc41b5c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.929515 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe135e30-f182-4517-8605-1097e7391663" (UID: "fe135e30-f182-4517-8605-1097e7391663"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.932295 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-config-data" (OuterVolumeSpecName: "config-data") pod "9d43ed11-7175-498a-8dc1-e15cfc41b5c8" (UID: "9d43ed11-7175-498a-8dc1-e15cfc41b5c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:22 crc kubenswrapper[4599]: I1012 07:49:22.936803 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-config-data" (OuterVolumeSpecName: "config-data") pod "fe135e30-f182-4517-8605-1097e7391663" (UID: "fe135e30-f182-4517-8605-1097e7391663"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.004586 4599 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.004615 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.004626 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpx8x\" (UniqueName: \"kubernetes.io/projected/fe135e30-f182-4517-8605-1097e7391663-kube-api-access-wpx8x\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.004638 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:23 crc kubenswrapper[4599]: 
I1012 07:49:23.004647 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.004655 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.004666 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwk57\" (UniqueName: \"kubernetes.io/projected/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-kube-api-access-lwk57\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.004675 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.004684 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d43ed11-7175-498a-8dc1-e15cfc41b5c8-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.004693 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqdgv\" (UniqueName: \"kubernetes.io/projected/616ae17d-da7f-4f46-9d1e-234bcb028377-kube-api-access-vqdgv\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.004700 4599 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe135e30-f182-4517-8605-1097e7391663-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.160100 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-tsv76" event={"ID":"fe135e30-f182-4517-8605-1097e7391663","Type":"ContainerDied","Data":"04f82369cc5cd945d983ea3d11c618039f7d944f00bfb98dab9a030da4affe36"} Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.160151 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04f82369cc5cd945d983ea3d11c618039f7d944f00bfb98dab9a030da4affe36" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.160115 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tsv76" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.161941 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pkz8c" event={"ID":"9d43ed11-7175-498a-8dc1-e15cfc41b5c8","Type":"ContainerDied","Data":"3b98a862f2a26e6aab4d936817b0e3e55f0f67169e1044617a0be1125a66b3cd"} Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.161992 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b98a862f2a26e6aab4d936817b0e3e55f0f67169e1044617a0be1125a66b3cd" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.162057 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-pkz8c" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.164668 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0e9d-account-create-xlkmf" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.164682 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0e9d-account-create-xlkmf" event={"ID":"616ae17d-da7f-4f46-9d1e-234bcb028377","Type":"ContainerDied","Data":"cae965b3e05b72852df4432fceb26eb9b849c4ca4dac57f2211ea9e00b732d0e"} Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.164720 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cae965b3e05b72852df4432fceb26eb9b849c4ca4dac57f2211ea9e00b732d0e" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.165009 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.165044 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.716048 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.716590 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.746991 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 12 07:49:23 crc kubenswrapper[4599]: I1012 07:49:23.759367 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.045280 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-55dcb544bd-wvf5d"] Oct 12 07:49:24 crc kubenswrapper[4599]: E1012 07:49:24.045754 4599 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="616ae17d-da7f-4f46-9d1e-234bcb028377" containerName="mariadb-account-create" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.045774 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="616ae17d-da7f-4f46-9d1e-234bcb028377" containerName="mariadb-account-create" Oct 12 07:49:24 crc kubenswrapper[4599]: E1012 07:49:24.045784 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe135e30-f182-4517-8605-1097e7391663" containerName="keystone-bootstrap" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.045791 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe135e30-f182-4517-8605-1097e7391663" containerName="keystone-bootstrap" Oct 12 07:49:24 crc kubenswrapper[4599]: E1012 07:49:24.045818 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0568da7-e8d7-4506-b405-8f7488ce28f9" containerName="init" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.045824 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0568da7-e8d7-4506-b405-8f7488ce28f9" containerName="init" Oct 12 07:49:24 crc kubenswrapper[4599]: E1012 07:49:24.045834 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0568da7-e8d7-4506-b405-8f7488ce28f9" containerName="dnsmasq-dns" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.045841 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0568da7-e8d7-4506-b405-8f7488ce28f9" containerName="dnsmasq-dns" Oct 12 07:49:24 crc kubenswrapper[4599]: E1012 07:49:24.045857 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d43ed11-7175-498a-8dc1-e15cfc41b5c8" containerName="placement-db-sync" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.045862 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d43ed11-7175-498a-8dc1-e15cfc41b5c8" containerName="placement-db-sync" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.046030 4599 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d0568da7-e8d7-4506-b405-8f7488ce28f9" containerName="dnsmasq-dns" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.046040 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d43ed11-7175-498a-8dc1-e15cfc41b5c8" containerName="placement-db-sync" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.046050 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe135e30-f182-4517-8605-1097e7391663" containerName="keystone-bootstrap" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.046059 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="616ae17d-da7f-4f46-9d1e-234bcb028377" containerName="mariadb-account-create" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.047054 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.054270 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.054611 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.054654 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.055589 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5d9d5b7486-4r48s"] Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.055847 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-kqzdj" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.056001 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.056445 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.066279 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.066468 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.066607 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.066720 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.066901 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6vkdx" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.067031 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.086106 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5d9d5b7486-4r48s"] Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.094918 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55dcb544bd-wvf5d"] Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.132029 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d346d4c-1358-4305-89ac-c9c012143de6-config-data\") pod \"placement-55dcb544bd-wvf5d\" (UID: \"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.132128 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-fernet-keys\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.132170 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh52k\" (UniqueName: \"kubernetes.io/projected/6d346d4c-1358-4305-89ac-c9c012143de6-kube-api-access-gh52k\") pod \"placement-55dcb544bd-wvf5d\" (UID: \"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.132194 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-internal-tls-certs\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.132212 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d346d4c-1358-4305-89ac-c9c012143de6-public-tls-certs\") pod \"placement-55dcb544bd-wvf5d\" (UID: \"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.132241 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-public-tls-certs\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.132261 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d346d4c-1358-4305-89ac-c9c012143de6-combined-ca-bundle\") pod \"placement-55dcb544bd-wvf5d\" (UID: \"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.132283 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-scripts\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.132300 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxnjj\" (UniqueName: \"kubernetes.io/projected/5d11880e-4007-4266-bf3b-8c1e3eea20b8-kube-api-access-jxnjj\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.132325 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d346d4c-1358-4305-89ac-c9c012143de6-scripts\") pod \"placement-55dcb544bd-wvf5d\" (UID: \"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.132354 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d346d4c-1358-4305-89ac-c9c012143de6-internal-tls-certs\") pod \"placement-55dcb544bd-wvf5d\" (UID: \"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.132375 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-combined-ca-bundle\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.132401 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-credential-keys\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.132438 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d346d4c-1358-4305-89ac-c9c012143de6-logs\") pod \"placement-55dcb544bd-wvf5d\" (UID: \"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.132459 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-config-data\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.183882 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f94bf31d-933c-4cca-998a-4c3fc9a451f4","Type":"ContainerStarted","Data":"036a903d34b4436b12b7ccac14fb27518208774daa9e31d4ce65967db6523fc1"} Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.188589 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-h6mzn" 
event={"ID":"8468e667-fc53-4e39-939f-2221c78d4313","Type":"ContainerStarted","Data":"cd02781890a38d192ec7d89fe075f2aefaef75bbc5f9779efa162dc3bacb067b"} Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.190905 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.190942 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.201776 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-h6mzn" podStartSLOduration=2.099698859 podStartE2EDuration="9.201764197s" podCreationTimestamp="2025-10-12 07:49:15 +0000 UTC" firstStartedPulling="2025-10-12 07:49:16.159455374 +0000 UTC m=+852.948650876" lastFinishedPulling="2025-10-12 07:49:23.261520713 +0000 UTC m=+860.050716214" observedRunningTime="2025-10-12 07:49:24.199809108 +0000 UTC m=+860.989004611" watchObservedRunningTime="2025-10-12 07:49:24.201764197 +0000 UTC m=+860.990959700" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.235097 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d346d4c-1358-4305-89ac-c9c012143de6-scripts\") pod \"placement-55dcb544bd-wvf5d\" (UID: \"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.235147 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d346d4c-1358-4305-89ac-c9c012143de6-internal-tls-certs\") pod \"placement-55dcb544bd-wvf5d\" (UID: \"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.235173 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-combined-ca-bundle\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.235208 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-credential-keys\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.235253 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d346d4c-1358-4305-89ac-c9c012143de6-logs\") pod \"placement-55dcb544bd-wvf5d\" (UID: \"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.235277 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-config-data\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.235298 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d346d4c-1358-4305-89ac-c9c012143de6-config-data\") pod \"placement-55dcb544bd-wvf5d\" (UID: \"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.235354 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-fernet-keys\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.235379 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh52k\" (UniqueName: \"kubernetes.io/projected/6d346d4c-1358-4305-89ac-c9c012143de6-kube-api-access-gh52k\") pod \"placement-55dcb544bd-wvf5d\" (UID: \"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.235395 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-internal-tls-certs\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.235411 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d346d4c-1358-4305-89ac-c9c012143de6-public-tls-certs\") pod \"placement-55dcb544bd-wvf5d\" (UID: \"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.235439 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-public-tls-certs\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.235455 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6d346d4c-1358-4305-89ac-c9c012143de6-combined-ca-bundle\") pod \"placement-55dcb544bd-wvf5d\" (UID: \"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.235475 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-scripts\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.235495 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxnjj\" (UniqueName: \"kubernetes.io/projected/5d11880e-4007-4266-bf3b-8c1e3eea20b8-kube-api-access-jxnjj\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.236754 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d346d4c-1358-4305-89ac-c9c012143de6-logs\") pod \"placement-55dcb544bd-wvf5d\" (UID: \"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.244585 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d346d4c-1358-4305-89ac-c9c012143de6-combined-ca-bundle\") pod \"placement-55dcb544bd-wvf5d\" (UID: \"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.245032 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d346d4c-1358-4305-89ac-c9c012143de6-config-data\") pod \"placement-55dcb544bd-wvf5d\" (UID: 
\"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.245116 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-internal-tls-certs\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.245517 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-config-data\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.245967 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d346d4c-1358-4305-89ac-c9c012143de6-scripts\") pod \"placement-55dcb544bd-wvf5d\" (UID: \"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.249303 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxnjj\" (UniqueName: \"kubernetes.io/projected/5d11880e-4007-4266-bf3b-8c1e3eea20b8-kube-api-access-jxnjj\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.251875 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-public-tls-certs\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: 
I1012 07:49:24.252482 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh52k\" (UniqueName: \"kubernetes.io/projected/6d346d4c-1358-4305-89ac-c9c012143de6-kube-api-access-gh52k\") pod \"placement-55dcb544bd-wvf5d\" (UID: \"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.257718 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d346d4c-1358-4305-89ac-c9c012143de6-public-tls-certs\") pod \"placement-55dcb544bd-wvf5d\" (UID: \"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.258411 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d346d4c-1358-4305-89ac-c9c012143de6-internal-tls-certs\") pod \"placement-55dcb544bd-wvf5d\" (UID: \"6d346d4c-1358-4305-89ac-c9c012143de6\") " pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.258642 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-combined-ca-bundle\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.259803 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-scripts\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.260597 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-credential-keys\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.260654 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d11880e-4007-4266-bf3b-8c1e3eea20b8-fernet-keys\") pod \"keystone-5d9d5b7486-4r48s\" (UID: \"5d11880e-4007-4266-bf3b-8c1e3eea20b8\") " pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.371415 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.382902 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.858460 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55dcb544bd-wvf5d"] Oct 12 07:49:24 crc kubenswrapper[4599]: W1012 07:49:24.868677 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d346d4c_1358_4305_89ac_c9c012143de6.slice/crio-4a36d4bd94bc70d60d964df378966ddcbd0c11f093424c06b2732f294dc6e46e WatchSource:0}: Error finding container 4a36d4bd94bc70d60d964df378966ddcbd0c11f093424c06b2732f294dc6e46e: Status 404 returned error can't find the container with id 4a36d4bd94bc70d60d964df378966ddcbd0c11f093424c06b2732f294dc6e46e Oct 12 07:49:24 crc kubenswrapper[4599]: I1012 07:49:24.963917 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5d9d5b7486-4r48s"] Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.053091 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-internal-api-0" Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.053794 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.206403 4599 generic.go:334] "Generic (PLEG): container finished" podID="8468e667-fc53-4e39-939f-2221c78d4313" containerID="cd02781890a38d192ec7d89fe075f2aefaef75bbc5f9779efa162dc3bacb067b" exitCode=0 Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.206522 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-h6mzn" event={"ID":"8468e667-fc53-4e39-939f-2221c78d4313","Type":"ContainerDied","Data":"cd02781890a38d192ec7d89fe075f2aefaef75bbc5f9779efa162dc3bacb067b"} Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.211825 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d9d5b7486-4r48s" event={"ID":"5d11880e-4007-4266-bf3b-8c1e3eea20b8","Type":"ContainerStarted","Data":"b5e79df3b2be3e9880252570b0628f0c96776ce0be2f9179c64bd447b96b3b0f"} Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.211869 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d9d5b7486-4r48s" event={"ID":"5d11880e-4007-4266-bf3b-8c1e3eea20b8","Type":"ContainerStarted","Data":"a1da07fb50e79044de2ce6111852ad95eede3e73055caf8233be2a5fa0dac3a1"} Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.212472 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.224929 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55dcb544bd-wvf5d" event={"ID":"6d346d4c-1358-4305-89ac-c9c012143de6","Type":"ContainerStarted","Data":"dc585646a5989ee7cfd51f0b42d11e56ab0795c38d65b860eb7429829a769b38"} Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.224960 4599 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/placement-55dcb544bd-wvf5d" event={"ID":"6d346d4c-1358-4305-89ac-c9c012143de6","Type":"ContainerStarted","Data":"4a36d4bd94bc70d60d964df378966ddcbd0c11f093424c06b2732f294dc6e46e"} Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.509540 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5d9d5b7486-4r48s" podStartSLOduration=1.509523012 podStartE2EDuration="1.509523012s" podCreationTimestamp="2025-10-12 07:49:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:49:25.262028531 +0000 UTC m=+862.051224033" watchObservedRunningTime="2025-10-12 07:49:25.509523012 +0000 UTC m=+862.298718515" Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.512216 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-jjn25"] Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.513524 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jjn25" Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.525730 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.530484 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.530843 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wlqpk" Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.537128 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jjn25"] Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.574136 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv65f\" (UniqueName: \"kubernetes.io/projected/dd14916d-8d1c-4c3d-9721-2d5e0e171db1-kube-api-access-wv65f\") pod \"neutron-db-sync-jjn25\" (UID: \"dd14916d-8d1c-4c3d-9721-2d5e0e171db1\") " pod="openstack/neutron-db-sync-jjn25" Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.574187 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd14916d-8d1c-4c3d-9721-2d5e0e171db1-config\") pod \"neutron-db-sync-jjn25\" (UID: \"dd14916d-8d1c-4c3d-9721-2d5e0e171db1\") " pod="openstack/neutron-db-sync-jjn25" Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.574462 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd14916d-8d1c-4c3d-9721-2d5e0e171db1-combined-ca-bundle\") pod \"neutron-db-sync-jjn25\" (UID: \"dd14916d-8d1c-4c3d-9721-2d5e0e171db1\") " pod="openstack/neutron-db-sync-jjn25" Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.676415 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wv65f\" (UniqueName: \"kubernetes.io/projected/dd14916d-8d1c-4c3d-9721-2d5e0e171db1-kube-api-access-wv65f\") pod \"neutron-db-sync-jjn25\" (UID: \"dd14916d-8d1c-4c3d-9721-2d5e0e171db1\") " pod="openstack/neutron-db-sync-jjn25" Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.676739 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd14916d-8d1c-4c3d-9721-2d5e0e171db1-config\") pod \"neutron-db-sync-jjn25\" (UID: \"dd14916d-8d1c-4c3d-9721-2d5e0e171db1\") " pod="openstack/neutron-db-sync-jjn25" Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.676815 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd14916d-8d1c-4c3d-9721-2d5e0e171db1-combined-ca-bundle\") pod \"neutron-db-sync-jjn25\" (UID: \"dd14916d-8d1c-4c3d-9721-2d5e0e171db1\") " pod="openstack/neutron-db-sync-jjn25" Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.682929 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd14916d-8d1c-4c3d-9721-2d5e0e171db1-config\") pod \"neutron-db-sync-jjn25\" (UID: \"dd14916d-8d1c-4c3d-9721-2d5e0e171db1\") " pod="openstack/neutron-db-sync-jjn25" Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.686207 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd14916d-8d1c-4c3d-9721-2d5e0e171db1-combined-ca-bundle\") pod \"neutron-db-sync-jjn25\" (UID: \"dd14916d-8d1c-4c3d-9721-2d5e0e171db1\") " pod="openstack/neutron-db-sync-jjn25" Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.690672 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv65f\" (UniqueName: 
\"kubernetes.io/projected/dd14916d-8d1c-4c3d-9721-2d5e0e171db1-kube-api-access-wv65f\") pod \"neutron-db-sync-jjn25\" (UID: \"dd14916d-8d1c-4c3d-9721-2d5e0e171db1\") " pod="openstack/neutron-db-sync-jjn25" Oct 12 07:49:25 crc kubenswrapper[4599]: I1012 07:49:25.870617 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jjn25" Oct 12 07:49:26 crc kubenswrapper[4599]: I1012 07:49:26.254086 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55dcb544bd-wvf5d" event={"ID":"6d346d4c-1358-4305-89ac-c9c012143de6","Type":"ContainerStarted","Data":"97a41a95eb92800da84d7cc4b5dfbe156398a9788788025339b64188516f5487"} Oct 12 07:49:26 crc kubenswrapper[4599]: I1012 07:49:26.274849 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-55dcb544bd-wvf5d" podStartSLOduration=2.274834799 podStartE2EDuration="2.274834799s" podCreationTimestamp="2025-10-12 07:49:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:49:26.273829131 +0000 UTC m=+863.063024633" watchObservedRunningTime="2025-10-12 07:49:26.274834799 +0000 UTC m=+863.064030300" Oct 12 07:49:26 crc kubenswrapper[4599]: I1012 07:49:26.375750 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jjn25"] Oct 12 07:49:26 crc kubenswrapper[4599]: I1012 07:49:26.378971 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 12 07:49:26 crc kubenswrapper[4599]: I1012 07:49:26.379187 4599 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 12 07:49:26 crc kubenswrapper[4599]: I1012 07:49:26.385445 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 12 07:49:27 crc kubenswrapper[4599]: I1012 07:49:27.261781 4599 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:27 crc kubenswrapper[4599]: I1012 07:49:27.262048 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:28 crc kubenswrapper[4599]: W1012 07:49:28.233712 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd14916d_8d1c_4c3d_9721_2d5e0e171db1.slice/crio-74b3220c5e595ad3558f71b4c9f98616b8bdc4a1af37f614112a7c711a5d7464 WatchSource:0}: Error finding container 74b3220c5e595ad3558f71b4c9f98616b8bdc4a1af37f614112a7c711a5d7464: Status 404 returned error can't find the container with id 74b3220c5e595ad3558f71b4c9f98616b8bdc4a1af37f614112a7c711a5d7464 Oct 12 07:49:28 crc kubenswrapper[4599]: I1012 07:49:28.272725 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jjn25" event={"ID":"dd14916d-8d1c-4c3d-9721-2d5e0e171db1","Type":"ContainerStarted","Data":"74b3220c5e595ad3558f71b4c9f98616b8bdc4a1af37f614112a7c711a5d7464"} Oct 12 07:49:28 crc kubenswrapper[4599]: I1012 07:49:28.321801 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:49:28 crc kubenswrapper[4599]: I1012 07:49:28.321866 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:49:36 crc kubenswrapper[4599]: I1012 07:49:36.144385 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-h6mzn" Oct 12 07:49:36 crc kubenswrapper[4599]: I1012 07:49:36.180087 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8468e667-fc53-4e39-939f-2221c78d4313-combined-ca-bundle\") pod \"8468e667-fc53-4e39-939f-2221c78d4313\" (UID: \"8468e667-fc53-4e39-939f-2221c78d4313\") " Oct 12 07:49:36 crc kubenswrapper[4599]: I1012 07:49:36.180253 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8468e667-fc53-4e39-939f-2221c78d4313-db-sync-config-data\") pod \"8468e667-fc53-4e39-939f-2221c78d4313\" (UID: \"8468e667-fc53-4e39-939f-2221c78d4313\") " Oct 12 07:49:36 crc kubenswrapper[4599]: I1012 07:49:36.180549 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrttr\" (UniqueName: \"kubernetes.io/projected/8468e667-fc53-4e39-939f-2221c78d4313-kube-api-access-vrttr\") pod \"8468e667-fc53-4e39-939f-2221c78d4313\" (UID: \"8468e667-fc53-4e39-939f-2221c78d4313\") " Oct 12 07:49:36 crc kubenswrapper[4599]: I1012 07:49:36.187911 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8468e667-fc53-4e39-939f-2221c78d4313-kube-api-access-vrttr" (OuterVolumeSpecName: "kube-api-access-vrttr") pod "8468e667-fc53-4e39-939f-2221c78d4313" (UID: "8468e667-fc53-4e39-939f-2221c78d4313"). InnerVolumeSpecName "kube-api-access-vrttr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:36 crc kubenswrapper[4599]: I1012 07:49:36.189136 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8468e667-fc53-4e39-939f-2221c78d4313-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8468e667-fc53-4e39-939f-2221c78d4313" (UID: "8468e667-fc53-4e39-939f-2221c78d4313"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:36 crc kubenswrapper[4599]: I1012 07:49:36.209624 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8468e667-fc53-4e39-939f-2221c78d4313-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8468e667-fc53-4e39-939f-2221c78d4313" (UID: "8468e667-fc53-4e39-939f-2221c78d4313"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:36 crc kubenswrapper[4599]: I1012 07:49:36.283757 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrttr\" (UniqueName: \"kubernetes.io/projected/8468e667-fc53-4e39-939f-2221c78d4313-kube-api-access-vrttr\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:36 crc kubenswrapper[4599]: I1012 07:49:36.283791 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8468e667-fc53-4e39-939f-2221c78d4313-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:36 crc kubenswrapper[4599]: I1012 07:49:36.283802 4599 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8468e667-fc53-4e39-939f-2221c78d4313-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:36 crc kubenswrapper[4599]: I1012 07:49:36.338457 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-h6mzn" event={"ID":"8468e667-fc53-4e39-939f-2221c78d4313","Type":"ContainerDied","Data":"6fa4c11bf7f8c99e9df06afc0918bd58d2acc2bf3c6bf895e33bdb67868ee9ab"} Oct 12 07:49:36 crc kubenswrapper[4599]: I1012 07:49:36.338521 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fa4c11bf7f8c99e9df06afc0918bd58d2acc2bf3c6bf895e33bdb67868ee9ab" Oct 12 07:49:36 crc kubenswrapper[4599]: I1012 07:49:36.338519 4599 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-h6mzn" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.351828 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f94bf31d-933c-4cca-998a-4c3fc9a451f4","Type":"ContainerStarted","Data":"fa724a8bca55acb47a6b6f2cfb8f4cc5b5f33a479cb09f48c902ef947f3880a7"} Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.354616 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jjn25" event={"ID":"dd14916d-8d1c-4c3d-9721-2d5e0e171db1","Type":"ContainerStarted","Data":"e99684c0cc74ae635f1895693c49738c2353540617db6b5aadc701551a93a579"} Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.373762 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-jjn25" podStartSLOduration=12.37374351 podStartE2EDuration="12.37374351s" podCreationTimestamp="2025-10-12 07:49:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:49:37.367474388 +0000 UTC m=+874.156669889" watchObservedRunningTime="2025-10-12 07:49:37.37374351 +0000 UTC m=+874.162939012" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.439899 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-58d46f7854-2972q"] Oct 12 07:49:37 crc kubenswrapper[4599]: E1012 07:49:37.440298 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8468e667-fc53-4e39-939f-2221c78d4313" containerName="barbican-db-sync" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.440317 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="8468e667-fc53-4e39-939f-2221c78d4313" containerName="barbican-db-sync" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.440520 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="8468e667-fc53-4e39-939f-2221c78d4313" 
containerName="barbican-db-sync" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.441379 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-58d46f7854-2972q" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.445581 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.445885 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-96cb4" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.446008 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.451747 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6d4cb4975-47tpw"] Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.453217 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6d4cb4975-47tpw" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.464984 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.474399 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d4cb4975-47tpw"] Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.499571 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-58d46f7854-2972q"] Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.579829 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cfb48f4f9-phx2l"] Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.581141 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cfb48f4f9-phx2l"] Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.581236 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.610446 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mklkt\" (UniqueName: \"kubernetes.io/projected/f8d2d027-f32a-4708-b7cb-5302f1def41f-kube-api-access-mklkt\") pod \"barbican-worker-6d4cb4975-47tpw\" (UID: \"f8d2d027-f32a-4708-b7cb-5302f1def41f\") " pod="openstack/barbican-worker-6d4cb4975-47tpw" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.610511 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8d2d027-f32a-4708-b7cb-5302f1def41f-combined-ca-bundle\") pod \"barbican-worker-6d4cb4975-47tpw\" (UID: \"f8d2d027-f32a-4708-b7cb-5302f1def41f\") " pod="openstack/barbican-worker-6d4cb4975-47tpw" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.610596 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82626116-d40d-45c2-8a6f-513cb12f6b19-logs\") pod \"barbican-keystone-listener-58d46f7854-2972q\" (UID: \"82626116-d40d-45c2-8a6f-513cb12f6b19\") " pod="openstack/barbican-keystone-listener-58d46f7854-2972q" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.610623 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82626116-d40d-45c2-8a6f-513cb12f6b19-config-data-custom\") pod \"barbican-keystone-listener-58d46f7854-2972q\" (UID: \"82626116-d40d-45c2-8a6f-513cb12f6b19\") " pod="openstack/barbican-keystone-listener-58d46f7854-2972q" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.610680 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59ldv\" (UniqueName: 
\"kubernetes.io/projected/82626116-d40d-45c2-8a6f-513cb12f6b19-kube-api-access-59ldv\") pod \"barbican-keystone-listener-58d46f7854-2972q\" (UID: \"82626116-d40d-45c2-8a6f-513cb12f6b19\") " pod="openstack/barbican-keystone-listener-58d46f7854-2972q" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.610714 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82626116-d40d-45c2-8a6f-513cb12f6b19-combined-ca-bundle\") pod \"barbican-keystone-listener-58d46f7854-2972q\" (UID: \"82626116-d40d-45c2-8a6f-513cb12f6b19\") " pod="openstack/barbican-keystone-listener-58d46f7854-2972q" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.610732 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8d2d027-f32a-4708-b7cb-5302f1def41f-config-data-custom\") pod \"barbican-worker-6d4cb4975-47tpw\" (UID: \"f8d2d027-f32a-4708-b7cb-5302f1def41f\") " pod="openstack/barbican-worker-6d4cb4975-47tpw" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.610797 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8d2d027-f32a-4708-b7cb-5302f1def41f-logs\") pod \"barbican-worker-6d4cb4975-47tpw\" (UID: \"f8d2d027-f32a-4708-b7cb-5302f1def41f\") " pod="openstack/barbican-worker-6d4cb4975-47tpw" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.610833 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82626116-d40d-45c2-8a6f-513cb12f6b19-config-data\") pod \"barbican-keystone-listener-58d46f7854-2972q\" (UID: \"82626116-d40d-45c2-8a6f-513cb12f6b19\") " pod="openstack/barbican-keystone-listener-58d46f7854-2972q" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.610857 
4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8d2d027-f32a-4708-b7cb-5302f1def41f-config-data\") pod \"barbican-worker-6d4cb4975-47tpw\" (UID: \"f8d2d027-f32a-4708-b7cb-5302f1def41f\") " pod="openstack/barbican-worker-6d4cb4975-47tpw" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.637785 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-bdcbd85c8-pvjzc"] Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.639389 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.648984 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-bdcbd85c8-pvjzc"] Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.649384 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.712667 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59ldv\" (UniqueName: \"kubernetes.io/projected/82626116-d40d-45c2-8a6f-513cb12f6b19-kube-api-access-59ldv\") pod \"barbican-keystone-listener-58d46f7854-2972q\" (UID: \"82626116-d40d-45c2-8a6f-513cb12f6b19\") " pod="openstack/barbican-keystone-listener-58d46f7854-2972q" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.712739 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82626116-d40d-45c2-8a6f-513cb12f6b19-combined-ca-bundle\") pod \"barbican-keystone-listener-58d46f7854-2972q\" (UID: \"82626116-d40d-45c2-8a6f-513cb12f6b19\") " pod="openstack/barbican-keystone-listener-58d46f7854-2972q" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.712762 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8d2d027-f32a-4708-b7cb-5302f1def41f-config-data-custom\") pod \"barbican-worker-6d4cb4975-47tpw\" (UID: \"f8d2d027-f32a-4708-b7cb-5302f1def41f\") " pod="openstack/barbican-worker-6d4cb4975-47tpw" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.712818 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8d2d027-f32a-4708-b7cb-5302f1def41f-logs\") pod \"barbican-worker-6d4cb4975-47tpw\" (UID: \"f8d2d027-f32a-4708-b7cb-5302f1def41f\") " pod="openstack/barbican-worker-6d4cb4975-47tpw" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.712844 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82626116-d40d-45c2-8a6f-513cb12f6b19-config-data\") pod \"barbican-keystone-listener-58d46f7854-2972q\" (UID: \"82626116-d40d-45c2-8a6f-513cb12f6b19\") " pod="openstack/barbican-keystone-listener-58d46f7854-2972q" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.712862 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8d2d027-f32a-4708-b7cb-5302f1def41f-config-data\") pod \"barbican-worker-6d4cb4975-47tpw\" (UID: \"f8d2d027-f32a-4708-b7cb-5302f1def41f\") " pod="openstack/barbican-worker-6d4cb4975-47tpw" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.712883 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-config\") pod \"dnsmasq-dns-6cfb48f4f9-phx2l\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.712918 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-dns-swift-storage-0\") pod \"dnsmasq-dns-6cfb48f4f9-phx2l\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.712941 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-ovsdbserver-sb\") pod \"dnsmasq-dns-6cfb48f4f9-phx2l\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.712964 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-dns-svc\") pod \"dnsmasq-dns-6cfb48f4f9-phx2l\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.712986 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-ovsdbserver-nb\") pod \"dnsmasq-dns-6cfb48f4f9-phx2l\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.713009 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mklkt\" (UniqueName: \"kubernetes.io/projected/f8d2d027-f32a-4708-b7cb-5302f1def41f-kube-api-access-mklkt\") pod \"barbican-worker-6d4cb4975-47tpw\" (UID: \"f8d2d027-f32a-4708-b7cb-5302f1def41f\") " pod="openstack/barbican-worker-6d4cb4975-47tpw" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 
07:49:37.713035 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8d2d027-f32a-4708-b7cb-5302f1def41f-combined-ca-bundle\") pod \"barbican-worker-6d4cb4975-47tpw\" (UID: \"f8d2d027-f32a-4708-b7cb-5302f1def41f\") " pod="openstack/barbican-worker-6d4cb4975-47tpw" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.713059 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg542\" (UniqueName: \"kubernetes.io/projected/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-kube-api-access-jg542\") pod \"dnsmasq-dns-6cfb48f4f9-phx2l\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.713108 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82626116-d40d-45c2-8a6f-513cb12f6b19-logs\") pod \"barbican-keystone-listener-58d46f7854-2972q\" (UID: \"82626116-d40d-45c2-8a6f-513cb12f6b19\") " pod="openstack/barbican-keystone-listener-58d46f7854-2972q" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.713127 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82626116-d40d-45c2-8a6f-513cb12f6b19-config-data-custom\") pod \"barbican-keystone-listener-58d46f7854-2972q\" (UID: \"82626116-d40d-45c2-8a6f-513cb12f6b19\") " pod="openstack/barbican-keystone-listener-58d46f7854-2972q" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.714188 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82626116-d40d-45c2-8a6f-513cb12f6b19-logs\") pod \"barbican-keystone-listener-58d46f7854-2972q\" (UID: \"82626116-d40d-45c2-8a6f-513cb12f6b19\") " pod="openstack/barbican-keystone-listener-58d46f7854-2972q" Oct 12 
07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.714931 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8d2d027-f32a-4708-b7cb-5302f1def41f-logs\") pod \"barbican-worker-6d4cb4975-47tpw\" (UID: \"f8d2d027-f32a-4708-b7cb-5302f1def41f\") " pod="openstack/barbican-worker-6d4cb4975-47tpw" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.719036 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82626116-d40d-45c2-8a6f-513cb12f6b19-config-data-custom\") pod \"barbican-keystone-listener-58d46f7854-2972q\" (UID: \"82626116-d40d-45c2-8a6f-513cb12f6b19\") " pod="openstack/barbican-keystone-listener-58d46f7854-2972q" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.719933 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8d2d027-f32a-4708-b7cb-5302f1def41f-config-data-custom\") pod \"barbican-worker-6d4cb4975-47tpw\" (UID: \"f8d2d027-f32a-4708-b7cb-5302f1def41f\") " pod="openstack/barbican-worker-6d4cb4975-47tpw" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.720007 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82626116-d40d-45c2-8a6f-513cb12f6b19-combined-ca-bundle\") pod \"barbican-keystone-listener-58d46f7854-2972q\" (UID: \"82626116-d40d-45c2-8a6f-513cb12f6b19\") " pod="openstack/barbican-keystone-listener-58d46f7854-2972q" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.720571 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8d2d027-f32a-4708-b7cb-5302f1def41f-combined-ca-bundle\") pod \"barbican-worker-6d4cb4975-47tpw\" (UID: \"f8d2d027-f32a-4708-b7cb-5302f1def41f\") " pod="openstack/barbican-worker-6d4cb4975-47tpw" Oct 12 07:49:37 crc 
kubenswrapper[4599]: I1012 07:49:37.721215 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82626116-d40d-45c2-8a6f-513cb12f6b19-config-data\") pod \"barbican-keystone-listener-58d46f7854-2972q\" (UID: \"82626116-d40d-45c2-8a6f-513cb12f6b19\") " pod="openstack/barbican-keystone-listener-58d46f7854-2972q" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.729250 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8d2d027-f32a-4708-b7cb-5302f1def41f-config-data\") pod \"barbican-worker-6d4cb4975-47tpw\" (UID: \"f8d2d027-f32a-4708-b7cb-5302f1def41f\") " pod="openstack/barbican-worker-6d4cb4975-47tpw" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.729263 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59ldv\" (UniqueName: \"kubernetes.io/projected/82626116-d40d-45c2-8a6f-513cb12f6b19-kube-api-access-59ldv\") pod \"barbican-keystone-listener-58d46f7854-2972q\" (UID: \"82626116-d40d-45c2-8a6f-513cb12f6b19\") " pod="openstack/barbican-keystone-listener-58d46f7854-2972q" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.730708 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mklkt\" (UniqueName: \"kubernetes.io/projected/f8d2d027-f32a-4708-b7cb-5302f1def41f-kube-api-access-mklkt\") pod \"barbican-worker-6d4cb4975-47tpw\" (UID: \"f8d2d027-f32a-4708-b7cb-5302f1def41f\") " pod="openstack/barbican-worker-6d4cb4975-47tpw" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.761300 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-58d46f7854-2972q" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.777426 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6d4cb4975-47tpw" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.815029 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5b5eea9-05b1-417c-a181-bb58db532acc-config-data-custom\") pod \"barbican-api-bdcbd85c8-pvjzc\" (UID: \"d5b5eea9-05b1-417c-a181-bb58db532acc\") " pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.815082 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5b5eea9-05b1-417c-a181-bb58db532acc-logs\") pod \"barbican-api-bdcbd85c8-pvjzc\" (UID: \"d5b5eea9-05b1-417c-a181-bb58db532acc\") " pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.815131 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-config\") pod \"dnsmasq-dns-6cfb48f4f9-phx2l\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.815167 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-dns-swift-storage-0\") pod \"dnsmasq-dns-6cfb48f4f9-phx2l\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.815189 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-ovsdbserver-sb\") pod \"dnsmasq-dns-6cfb48f4f9-phx2l\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " 
pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.815208 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp752\" (UniqueName: \"kubernetes.io/projected/d5b5eea9-05b1-417c-a181-bb58db532acc-kube-api-access-tp752\") pod \"barbican-api-bdcbd85c8-pvjzc\" (UID: \"d5b5eea9-05b1-417c-a181-bb58db532acc\") " pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.815229 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-dns-svc\") pod \"dnsmasq-dns-6cfb48f4f9-phx2l\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.815249 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-ovsdbserver-nb\") pod \"dnsmasq-dns-6cfb48f4f9-phx2l\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.815272 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b5eea9-05b1-417c-a181-bb58db532acc-config-data\") pod \"barbican-api-bdcbd85c8-pvjzc\" (UID: \"d5b5eea9-05b1-417c-a181-bb58db532acc\") " pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.815308 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg542\" (UniqueName: \"kubernetes.io/projected/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-kube-api-access-jg542\") pod \"dnsmasq-dns-6cfb48f4f9-phx2l\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " 
pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.815356 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b5eea9-05b1-417c-a181-bb58db532acc-combined-ca-bundle\") pod \"barbican-api-bdcbd85c8-pvjzc\" (UID: \"d5b5eea9-05b1-417c-a181-bb58db532acc\") " pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.816284 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-ovsdbserver-sb\") pod \"dnsmasq-dns-6cfb48f4f9-phx2l\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.816327 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-dns-svc\") pod \"dnsmasq-dns-6cfb48f4f9-phx2l\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.816473 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-config\") pod \"dnsmasq-dns-6cfb48f4f9-phx2l\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.816905 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-ovsdbserver-nb\") pod \"dnsmasq-dns-6cfb48f4f9-phx2l\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 
07:49:37.817073 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-dns-swift-storage-0\") pod \"dnsmasq-dns-6cfb48f4f9-phx2l\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.833114 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg542\" (UniqueName: \"kubernetes.io/projected/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-kube-api-access-jg542\") pod \"dnsmasq-dns-6cfb48f4f9-phx2l\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.917872 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.918322 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5b5eea9-05b1-417c-a181-bb58db532acc-config-data-custom\") pod \"barbican-api-bdcbd85c8-pvjzc\" (UID: \"d5b5eea9-05b1-417c-a181-bb58db532acc\") " pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.918401 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5b5eea9-05b1-417c-a181-bb58db532acc-logs\") pod \"barbican-api-bdcbd85c8-pvjzc\" (UID: \"d5b5eea9-05b1-417c-a181-bb58db532acc\") " pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.918449 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp752\" (UniqueName: \"kubernetes.io/projected/d5b5eea9-05b1-417c-a181-bb58db532acc-kube-api-access-tp752\") pod \"barbican-api-bdcbd85c8-pvjzc\" (UID: 
\"d5b5eea9-05b1-417c-a181-bb58db532acc\") " pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.918482 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b5eea9-05b1-417c-a181-bb58db532acc-config-data\") pod \"barbican-api-bdcbd85c8-pvjzc\" (UID: \"d5b5eea9-05b1-417c-a181-bb58db532acc\") " pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.918536 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b5eea9-05b1-417c-a181-bb58db532acc-combined-ca-bundle\") pod \"barbican-api-bdcbd85c8-pvjzc\" (UID: \"d5b5eea9-05b1-417c-a181-bb58db532acc\") " pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.918874 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5b5eea9-05b1-417c-a181-bb58db532acc-logs\") pod \"barbican-api-bdcbd85c8-pvjzc\" (UID: \"d5b5eea9-05b1-417c-a181-bb58db532acc\") " pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.929684 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b5eea9-05b1-417c-a181-bb58db532acc-config-data\") pod \"barbican-api-bdcbd85c8-pvjzc\" (UID: \"d5b5eea9-05b1-417c-a181-bb58db532acc\") " pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.931418 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b5eea9-05b1-417c-a181-bb58db532acc-combined-ca-bundle\") pod \"barbican-api-bdcbd85c8-pvjzc\" (UID: \"d5b5eea9-05b1-417c-a181-bb58db532acc\") " pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:37 
crc kubenswrapper[4599]: I1012 07:49:37.931486 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5b5eea9-05b1-417c-a181-bb58db532acc-config-data-custom\") pod \"barbican-api-bdcbd85c8-pvjzc\" (UID: \"d5b5eea9-05b1-417c-a181-bb58db532acc\") " pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.936712 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp752\" (UniqueName: \"kubernetes.io/projected/d5b5eea9-05b1-417c-a181-bb58db532acc-kube-api-access-tp752\") pod \"barbican-api-bdcbd85c8-pvjzc\" (UID: \"d5b5eea9-05b1-417c-a181-bb58db532acc\") " pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:37 crc kubenswrapper[4599]: I1012 07:49:37.959311 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:38 crc kubenswrapper[4599]: I1012 07:49:38.200324 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-58d46f7854-2972q"] Oct 12 07:49:38 crc kubenswrapper[4599]: I1012 07:49:38.215903 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d4cb4975-47tpw"] Oct 12 07:49:38 crc kubenswrapper[4599]: I1012 07:49:38.358546 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cfb48f4f9-phx2l"] Oct 12 07:49:38 crc kubenswrapper[4599]: I1012 07:49:38.367255 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58d46f7854-2972q" event={"ID":"82626116-d40d-45c2-8a6f-513cb12f6b19","Type":"ContainerStarted","Data":"94aa5fb065a91ed0104a82d214df74f795d8ac102d4875a076d04b80e5b7da56"} Oct 12 07:49:38 crc kubenswrapper[4599]: I1012 07:49:38.369183 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x47gg" 
event={"ID":"dc0cde93-5cbb-4652-9dc3-05666e908b49","Type":"ContainerStarted","Data":"0ae8e7b7cbd7d13a1e8b77c33d83159eb9cb77ba4482cdac6c60a1cc21408e05"} Oct 12 07:49:38 crc kubenswrapper[4599]: I1012 07:49:38.370366 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d4cb4975-47tpw" event={"ID":"f8d2d027-f32a-4708-b7cb-5302f1def41f","Type":"ContainerStarted","Data":"d9099101867a3776a30ab0412a35f0789a414c2e946bc471a0065214a02e9730"} Oct 12 07:49:38 crc kubenswrapper[4599]: I1012 07:49:38.387865 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-x47gg" podStartSLOduration=2.174009952 podStartE2EDuration="23.387852225s" podCreationTimestamp="2025-10-12 07:49:15 +0000 UTC" firstStartedPulling="2025-10-12 07:49:15.959686292 +0000 UTC m=+852.748881794" lastFinishedPulling="2025-10-12 07:49:37.173528576 +0000 UTC m=+873.962724067" observedRunningTime="2025-10-12 07:49:38.382238782 +0000 UTC m=+875.171434284" watchObservedRunningTime="2025-10-12 07:49:38.387852225 +0000 UTC m=+875.177047728" Oct 12 07:49:38 crc kubenswrapper[4599]: I1012 07:49:38.436871 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-bdcbd85c8-pvjzc"] Oct 12 07:49:39 crc kubenswrapper[4599]: I1012 07:49:39.382837 4599 generic.go:334] "Generic (PLEG): container finished" podID="bba461b1-74aa-45cf-bf02-2569bf3a0c2d" containerID="c0c7036391c45aa48ea68db4a505e984bb71baecd7cbfa2868c0c57b08b0b89a" exitCode=0 Oct 12 07:49:39 crc kubenswrapper[4599]: I1012 07:49:39.383150 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" event={"ID":"bba461b1-74aa-45cf-bf02-2569bf3a0c2d","Type":"ContainerDied","Data":"c0c7036391c45aa48ea68db4a505e984bb71baecd7cbfa2868c0c57b08b0b89a"} Oct 12 07:49:39 crc kubenswrapper[4599]: I1012 07:49:39.383184 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" 
event={"ID":"bba461b1-74aa-45cf-bf02-2569bf3a0c2d","Type":"ContainerStarted","Data":"7ba6e6fa5ddaa1573d3c4fe7b065a0c370efb8ff22a3c97260b814d43121e513"} Oct 12 07:49:39 crc kubenswrapper[4599]: I1012 07:49:39.394513 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bdcbd85c8-pvjzc" event={"ID":"d5b5eea9-05b1-417c-a181-bb58db532acc","Type":"ContainerStarted","Data":"31fbe1e390315b7164049d2e778c5dc4012cec9e0dd2e8e2f9c81250418829b7"} Oct 12 07:49:39 crc kubenswrapper[4599]: I1012 07:49:39.394618 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bdcbd85c8-pvjzc" event={"ID":"d5b5eea9-05b1-417c-a181-bb58db532acc","Type":"ContainerStarted","Data":"0aeb413e3dd8be9b9cf5e4df72e9eb5dd9985af11d03e78f933583c58116d3ee"} Oct 12 07:49:39 crc kubenswrapper[4599]: I1012 07:49:39.394633 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bdcbd85c8-pvjzc" event={"ID":"d5b5eea9-05b1-417c-a181-bb58db532acc","Type":"ContainerStarted","Data":"0ca9a93a1222b4cb8067f5662b287e52f7ea492cb96f97f2c7592c5f6cc75682"} Oct 12 07:49:39 crc kubenswrapper[4599]: I1012 07:49:39.394849 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:39 crc kubenswrapper[4599]: I1012 07:49:39.394951 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:39 crc kubenswrapper[4599]: I1012 07:49:39.416108 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-bdcbd85c8-pvjzc" podStartSLOduration=2.4160865 podStartE2EDuration="2.4160865s" podCreationTimestamp="2025-10-12 07:49:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:49:39.415325945 +0000 UTC m=+876.204521447" watchObservedRunningTime="2025-10-12 07:49:39.4160865 +0000 UTC 
m=+876.205281992" Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.405888 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" event={"ID":"bba461b1-74aa-45cf-bf02-2569bf3a0c2d","Type":"ContainerStarted","Data":"7cef59e154cead2059286f66aa6ea852fb5213ed3e44957420b02cefe3bd0565"} Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.406720 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.775623 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" podStartSLOduration=3.775604155 podStartE2EDuration="3.775604155s" podCreationTimestamp="2025-10-12 07:49:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:49:40.42548169 +0000 UTC m=+877.214677193" watchObservedRunningTime="2025-10-12 07:49:40.775604155 +0000 UTC m=+877.564799657" Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.776988 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-689dd94bf4-pwcdz"] Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.778279 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.782433 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.782631 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.808811 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-689dd94bf4-pwcdz"] Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.896422 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c247e243-5ad3-4e53-a733-a11d9407c42a-config-data\") pod \"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.896706 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c247e243-5ad3-4e53-a733-a11d9407c42a-logs\") pod \"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.896844 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c247e243-5ad3-4e53-a733-a11d9407c42a-internal-tls-certs\") pod \"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.896974 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c247e243-5ad3-4e53-a733-a11d9407c42a-config-data-custom\") pod \"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.897110 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdgb6\" (UniqueName: \"kubernetes.io/projected/c247e243-5ad3-4e53-a733-a11d9407c42a-kube-api-access-gdgb6\") pod \"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.897228 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c247e243-5ad3-4e53-a733-a11d9407c42a-public-tls-certs\") pod \"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.897350 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c247e243-5ad3-4e53-a733-a11d9407c42a-combined-ca-bundle\") pod \"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.999093 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdgb6\" (UniqueName: \"kubernetes.io/projected/c247e243-5ad3-4e53-a733-a11d9407c42a-kube-api-access-gdgb6\") pod \"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.999169 4599 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c247e243-5ad3-4e53-a733-a11d9407c42a-public-tls-certs\") pod \"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.999207 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c247e243-5ad3-4e53-a733-a11d9407c42a-combined-ca-bundle\") pod \"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.999275 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c247e243-5ad3-4e53-a733-a11d9407c42a-config-data\") pod \"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.999299 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c247e243-5ad3-4e53-a733-a11d9407c42a-logs\") pod \"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.999345 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c247e243-5ad3-4e53-a733-a11d9407c42a-internal-tls-certs\") pod \"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:40 crc kubenswrapper[4599]: I1012 07:49:40.999375 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c247e243-5ad3-4e53-a733-a11d9407c42a-config-data-custom\") pod \"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:41 crc kubenswrapper[4599]: I1012 07:49:41.000440 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c247e243-5ad3-4e53-a733-a11d9407c42a-logs\") pod \"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:41 crc kubenswrapper[4599]: I1012 07:49:41.004945 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c247e243-5ad3-4e53-a733-a11d9407c42a-config-data-custom\") pod \"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:41 crc kubenswrapper[4599]: I1012 07:49:41.005740 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c247e243-5ad3-4e53-a733-a11d9407c42a-combined-ca-bundle\") pod \"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:41 crc kubenswrapper[4599]: I1012 07:49:41.006368 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c247e243-5ad3-4e53-a733-a11d9407c42a-config-data\") pod \"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:41 crc kubenswrapper[4599]: I1012 07:49:41.010715 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c247e243-5ad3-4e53-a733-a11d9407c42a-internal-tls-certs\") pod 
\"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:41 crc kubenswrapper[4599]: I1012 07:49:41.014423 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c247e243-5ad3-4e53-a733-a11d9407c42a-public-tls-certs\") pod \"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:41 crc kubenswrapper[4599]: I1012 07:49:41.014842 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdgb6\" (UniqueName: \"kubernetes.io/projected/c247e243-5ad3-4e53-a733-a11d9407c42a-kube-api-access-gdgb6\") pod \"barbican-api-689dd94bf4-pwcdz\" (UID: \"c247e243-5ad3-4e53-a733-a11d9407c42a\") " pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:41 crc kubenswrapper[4599]: I1012 07:49:41.095369 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:41 crc kubenswrapper[4599]: I1012 07:49:41.417797 4599 generic.go:334] "Generic (PLEG): container finished" podID="dc0cde93-5cbb-4652-9dc3-05666e908b49" containerID="0ae8e7b7cbd7d13a1e8b77c33d83159eb9cb77ba4482cdac6c60a1cc21408e05" exitCode=0 Oct 12 07:49:41 crc kubenswrapper[4599]: I1012 07:49:41.417885 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x47gg" event={"ID":"dc0cde93-5cbb-4652-9dc3-05666e908b49","Type":"ContainerDied","Data":"0ae8e7b7cbd7d13a1e8b77c33d83159eb9cb77ba4482cdac6c60a1cc21408e05"} Oct 12 07:49:43 crc kubenswrapper[4599]: I1012 07:49:43.731201 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:43 crc kubenswrapper[4599]: I1012 07:49:43.856557 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-scripts\") pod \"dc0cde93-5cbb-4652-9dc3-05666e908b49\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " Oct 12 07:49:43 crc kubenswrapper[4599]: I1012 07:49:43.856596 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d7cz\" (UniqueName: \"kubernetes.io/projected/dc0cde93-5cbb-4652-9dc3-05666e908b49-kube-api-access-4d7cz\") pod \"dc0cde93-5cbb-4652-9dc3-05666e908b49\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " Oct 12 07:49:43 crc kubenswrapper[4599]: I1012 07:49:43.856635 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-combined-ca-bundle\") pod \"dc0cde93-5cbb-4652-9dc3-05666e908b49\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " Oct 12 07:49:43 crc kubenswrapper[4599]: I1012 07:49:43.856683 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-db-sync-config-data\") pod \"dc0cde93-5cbb-4652-9dc3-05666e908b49\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " Oct 12 07:49:43 crc kubenswrapper[4599]: I1012 07:49:43.856702 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-config-data\") pod \"dc0cde93-5cbb-4652-9dc3-05666e908b49\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " Oct 12 07:49:43 crc kubenswrapper[4599]: I1012 07:49:43.856742 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/dc0cde93-5cbb-4652-9dc3-05666e908b49-etc-machine-id\") pod \"dc0cde93-5cbb-4652-9dc3-05666e908b49\" (UID: \"dc0cde93-5cbb-4652-9dc3-05666e908b49\") " Oct 12 07:49:43 crc kubenswrapper[4599]: I1012 07:49:43.857012 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc0cde93-5cbb-4652-9dc3-05666e908b49-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dc0cde93-5cbb-4652-9dc3-05666e908b49" (UID: "dc0cde93-5cbb-4652-9dc3-05666e908b49"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:49:43 crc kubenswrapper[4599]: I1012 07:49:43.862005 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dc0cde93-5cbb-4652-9dc3-05666e908b49" (UID: "dc0cde93-5cbb-4652-9dc3-05666e908b49"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:43 crc kubenswrapper[4599]: I1012 07:49:43.870354 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc0cde93-5cbb-4652-9dc3-05666e908b49-kube-api-access-4d7cz" (OuterVolumeSpecName: "kube-api-access-4d7cz") pod "dc0cde93-5cbb-4652-9dc3-05666e908b49" (UID: "dc0cde93-5cbb-4652-9dc3-05666e908b49"). InnerVolumeSpecName "kube-api-access-4d7cz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:43 crc kubenswrapper[4599]: I1012 07:49:43.870440 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-scripts" (OuterVolumeSpecName: "scripts") pod "dc0cde93-5cbb-4652-9dc3-05666e908b49" (UID: "dc0cde93-5cbb-4652-9dc3-05666e908b49"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:43 crc kubenswrapper[4599]: I1012 07:49:43.883674 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc0cde93-5cbb-4652-9dc3-05666e908b49" (UID: "dc0cde93-5cbb-4652-9dc3-05666e908b49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:43 crc kubenswrapper[4599]: I1012 07:49:43.903736 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-config-data" (OuterVolumeSpecName: "config-data") pod "dc0cde93-5cbb-4652-9dc3-05666e908b49" (UID: "dc0cde93-5cbb-4652-9dc3-05666e908b49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:43 crc kubenswrapper[4599]: I1012 07:49:43.958165 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:43 crc kubenswrapper[4599]: I1012 07:49:43.958191 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d7cz\" (UniqueName: \"kubernetes.io/projected/dc0cde93-5cbb-4652-9dc3-05666e908b49-kube-api-access-4d7cz\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:43 crc kubenswrapper[4599]: I1012 07:49:43.958204 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:43 crc kubenswrapper[4599]: I1012 07:49:43.958213 4599 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-db-sync-config-data\") on node \"crc\" 
DevicePath \"\"" Oct 12 07:49:43 crc kubenswrapper[4599]: I1012 07:49:43.958223 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0cde93-5cbb-4652-9dc3-05666e908b49-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:43 crc kubenswrapper[4599]: I1012 07:49:43.958231 4599 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc0cde93-5cbb-4652-9dc3-05666e908b49-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:44 crc kubenswrapper[4599]: I1012 07:49:44.445430 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x47gg" event={"ID":"dc0cde93-5cbb-4652-9dc3-05666e908b49","Type":"ContainerDied","Data":"e3cc40fdec404afbb24efe8268d6fb4ae39de440a6f89184660b1c77f3be41d2"} Oct 12 07:49:44 crc kubenswrapper[4599]: I1012 07:49:44.445748 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3cc40fdec404afbb24efe8268d6fb4ae39de440a6f89184660b1c77f3be41d2" Oct 12 07:49:44 crc kubenswrapper[4599]: I1012 07:49:44.445824 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-x47gg" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.048865 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 12 07:49:45 crc kubenswrapper[4599]: E1012 07:49:45.049836 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc0cde93-5cbb-4652-9dc3-05666e908b49" containerName="cinder-db-sync" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.049853 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc0cde93-5cbb-4652-9dc3-05666e908b49" containerName="cinder-db-sync" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.050034 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc0cde93-5cbb-4652-9dc3-05666e908b49" containerName="cinder-db-sync" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.051169 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.053737 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.053934 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nchcj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.054067 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.060192 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.106396 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.134277 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cfb48f4f9-phx2l"] Oct 12 07:49:45 crc 
kubenswrapper[4599]: I1012 07:49:45.134627 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" podUID="bba461b1-74aa-45cf-bf02-2569bf3a0c2d" containerName="dnsmasq-dns" containerID="cri-o://7cef59e154cead2059286f66aa6ea852fb5213ed3e44957420b02cefe3bd0565" gracePeriod=10 Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.138945 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.154518 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dd96ddfff-bbpjj"] Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.156459 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.168138 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dd96ddfff-bbpjj"] Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.199700 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-scripts\") pod \"cinder-scheduler-0\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") " pod="openstack/cinder-scheduler-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.199753 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") " pod="openstack/cinder-scheduler-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.199898 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-config-data\") pod \"cinder-scheduler-0\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") " pod="openstack/cinder-scheduler-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.200034 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78s68\" (UniqueName: \"kubernetes.io/projected/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-kube-api-access-78s68\") pod \"cinder-scheduler-0\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") " pod="openstack/cinder-scheduler-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.200233 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") " pod="openstack/cinder-scheduler-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.200277 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") " pod="openstack/cinder-scheduler-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.267113 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.271324 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.275156 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.288911 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.303429 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-config\") pod \"dnsmasq-dns-5dd96ddfff-bbpjj\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.303543 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") " pod="openstack/cinder-scheduler-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.303595 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-dns-svc\") pod \"dnsmasq-dns-5dd96ddfff-bbpjj\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.303629 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") " pod="openstack/cinder-scheduler-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.303676 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-scripts\") pod \"cinder-scheduler-0\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") " pod="openstack/cinder-scheduler-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.303708 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") " pod="openstack/cinder-scheduler-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.303828 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-config-data\") pod \"cinder-scheduler-0\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") " pod="openstack/cinder-scheduler-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.303863 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-dns-swift-storage-0\") pod \"dnsmasq-dns-5dd96ddfff-bbpjj\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.303942 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-ovsdbserver-nb\") pod \"dnsmasq-dns-5dd96ddfff-bbpjj\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.303974 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-ovsdbserver-sb\") pod \"dnsmasq-dns-5dd96ddfff-bbpjj\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.304009 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78s68\" (UniqueName: \"kubernetes.io/projected/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-kube-api-access-78s68\") pod \"cinder-scheduler-0\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") " pod="openstack/cinder-scheduler-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.304049 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrzjx\" (UniqueName: \"kubernetes.io/projected/7ea02d25-75be-4c25-bc35-539b4d590e10-kube-api-access-vrzjx\") pod \"dnsmasq-dns-5dd96ddfff-bbpjj\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.308691 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") " pod="openstack/cinder-scheduler-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.316081 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-scripts\") pod \"cinder-scheduler-0\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") " pod="openstack/cinder-scheduler-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.320817 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-config-data-custom\") pod 
\"cinder-scheduler-0\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") " pod="openstack/cinder-scheduler-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.320883 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") " pod="openstack/cinder-scheduler-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.324172 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-config-data\") pod \"cinder-scheduler-0\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") " pod="openstack/cinder-scheduler-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.328199 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-689dd94bf4-pwcdz"] Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.342715 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78s68\" (UniqueName: \"kubernetes.io/projected/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-kube-api-access-78s68\") pod \"cinder-scheduler-0\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") " pod="openstack/cinder-scheduler-0" Oct 12 07:49:45 crc kubenswrapper[4599]: W1012 07:49:45.360504 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc247e243_5ad3_4e53_a733_a11d9407c42a.slice/crio-1d6f5e44dc15f352c0e608fd897d21a2128a10e4b8fef435fc1f5f25fcd2f6e3 WatchSource:0}: Error finding container 1d6f5e44dc15f352c0e608fd897d21a2128a10e4b8fef435fc1f5f25fcd2f6e3: Status 404 returned error can't find the container with id 1d6f5e44dc15f352c0e608fd897d21a2128a10e4b8fef435fc1f5f25fcd2f6e3 Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.388776 4599 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.409500 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-dns-svc\") pod \"dnsmasq-dns-5dd96ddfff-bbpjj\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.409599 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68qvr\" (UniqueName: \"kubernetes.io/projected/df8f9a6f-b14e-4241-9102-4473869410d8-kube-api-access-68qvr\") pod \"cinder-api-0\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.409629 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-config-data-custom\") pod \"cinder-api-0\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.409649 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df8f9a6f-b14e-4241-9102-4473869410d8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.409682 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-config-data\") pod \"cinder-api-0\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc 
kubenswrapper[4599]: I1012 07:49:45.409709 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-dns-swift-storage-0\") pod \"dnsmasq-dns-5dd96ddfff-bbpjj\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.409725 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-scripts\") pod \"cinder-api-0\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.409762 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df8f9a6f-b14e-4241-9102-4473869410d8-logs\") pod \"cinder-api-0\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.409799 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-ovsdbserver-nb\") pod \"dnsmasq-dns-5dd96ddfff-bbpjj\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.409817 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-ovsdbserver-sb\") pod \"dnsmasq-dns-5dd96ddfff-bbpjj\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.409835 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.409867 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrzjx\" (UniqueName: \"kubernetes.io/projected/7ea02d25-75be-4c25-bc35-539b4d590e10-kube-api-access-vrzjx\") pod \"dnsmasq-dns-5dd96ddfff-bbpjj\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.409912 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-config\") pod \"dnsmasq-dns-5dd96ddfff-bbpjj\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.410822 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-config\") pod \"dnsmasq-dns-5dd96ddfff-bbpjj\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.411357 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-dns-svc\") pod \"dnsmasq-dns-5dd96ddfff-bbpjj\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.414680 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-dns-swift-storage-0\") pod \"dnsmasq-dns-5dd96ddfff-bbpjj\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.414753 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-ovsdbserver-nb\") pod \"dnsmasq-dns-5dd96ddfff-bbpjj\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.415292 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-ovsdbserver-sb\") pod \"dnsmasq-dns-5dd96ddfff-bbpjj\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.437209 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrzjx\" (UniqueName: \"kubernetes.io/projected/7ea02d25-75be-4c25-bc35-539b4d590e10-kube-api-access-vrzjx\") pod \"dnsmasq-dns-5dd96ddfff-bbpjj\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.479519 4599 generic.go:334] "Generic (PLEG): container finished" podID="dd14916d-8d1c-4c3d-9721-2d5e0e171db1" containerID="e99684c0cc74ae635f1895693c49738c2353540617db6b5aadc701551a93a579" exitCode=0 Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.479602 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jjn25" event={"ID":"dd14916d-8d1c-4c3d-9721-2d5e0e171db1","Type":"ContainerDied","Data":"e99684c0cc74ae635f1895693c49738c2353540617db6b5aadc701551a93a579"} Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.486023 
4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58d46f7854-2972q" event={"ID":"82626116-d40d-45c2-8a6f-513cb12f6b19","Type":"ContainerStarted","Data":"4e00775a94f7168f24f30f72f98c31032fc332d04c3a5419558bf28a7093f11c"} Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.492515 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d4cb4975-47tpw" event={"ID":"f8d2d027-f32a-4708-b7cb-5302f1def41f","Type":"ContainerStarted","Data":"2340a22bfc4f1cdd35fab59231d44719110d097f5e5c93d0cbf953db810ea750"} Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.512612 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.512733 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68qvr\" (UniqueName: \"kubernetes.io/projected/df8f9a6f-b14e-4241-9102-4473869410d8-kube-api-access-68qvr\") pod \"cinder-api-0\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.512765 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-config-data-custom\") pod \"cinder-api-0\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.512784 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df8f9a6f-b14e-4241-9102-4473869410d8-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.512804 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-config-data\") pod \"cinder-api-0\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.512825 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-scripts\") pod \"cinder-api-0\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.512849 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df8f9a6f-b14e-4241-9102-4473869410d8-logs\") pod \"cinder-api-0\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.513520 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df8f9a6f-b14e-4241-9102-4473869410d8-logs\") pod \"cinder-api-0\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.513571 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df8f9a6f-b14e-4241-9102-4473869410d8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.518733 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-config-data-custom\") pod \"cinder-api-0\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.519162 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-config-data\") pod \"cinder-api-0\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.519868 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.536796 4599 generic.go:334] "Generic (PLEG): container finished" podID="bba461b1-74aa-45cf-bf02-2569bf3a0c2d" containerID="7cef59e154cead2059286f66aa6ea852fb5213ed3e44957420b02cefe3bd0565" exitCode=0 Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.536906 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" event={"ID":"bba461b1-74aa-45cf-bf02-2569bf3a0c2d","Type":"ContainerDied","Data":"7cef59e154cead2059286f66aa6ea852fb5213ed3e44957420b02cefe3bd0565"} Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.536907 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-scripts\") pod \"cinder-api-0\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.539221 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-689dd94bf4-pwcdz" 
event={"ID":"c247e243-5ad3-4e53-a733-a11d9407c42a","Type":"ContainerStarted","Data":"1d6f5e44dc15f352c0e608fd897d21a2128a10e4b8fef435fc1f5f25fcd2f6e3"} Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.540832 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68qvr\" (UniqueName: \"kubernetes.io/projected/df8f9a6f-b14e-4241-9102-4473869410d8-kube-api-access-68qvr\") pod \"cinder-api-0\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.551170 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" containerName="ceilometer-central-agent" containerID="cri-o://7f1c45d5e3d2dfbf584a7cb0627a5bc4ccaff62de2ca235d6cde978919ccda81" gracePeriod=30 Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.551286 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" containerName="proxy-httpd" containerID="cri-o://4ed548665907723269ec5dee79aea041449fb9ce8af4a395c3f4fb4ed7523ae1" gracePeriod=30 Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.551349 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" containerName="sg-core" containerID="cri-o://fa724a8bca55acb47a6b6f2cfb8f4cc5b5f33a479cb09f48c902ef947f3880a7" gracePeriod=30 Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.551405 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" containerName="ceilometer-notification-agent" containerID="cri-o://036a903d34b4436b12b7ccac14fb27518208774daa9e31d4ce65967db6523fc1" gracePeriod=30 Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.558688 4599 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f94bf31d-933c-4cca-998a-4c3fc9a451f4","Type":"ContainerStarted","Data":"4ed548665907723269ec5dee79aea041449fb9ce8af4a395c3f4fb4ed7523ae1"} Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.558734 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.577503 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.580688 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.85893193 podStartE2EDuration="40.580673313s" podCreationTimestamp="2025-10-12 07:49:05 +0000 UTC" firstStartedPulling="2025-10-12 07:49:07.109348615 +0000 UTC m=+843.898544118" lastFinishedPulling="2025-10-12 07:49:44.831089998 +0000 UTC m=+881.620285501" observedRunningTime="2025-10-12 07:49:45.570621022 +0000 UTC m=+882.359816524" watchObservedRunningTime="2025-10-12 07:49:45.580673313 +0000 UTC m=+882.369868815" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.617001 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.638241 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.729636 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-dns-swift-storage-0\") pod \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.729926 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-config\") pod \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.729977 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg542\" (UniqueName: \"kubernetes.io/projected/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-kube-api-access-jg542\") pod \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.730007 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-dns-svc\") pod \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.730044 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-ovsdbserver-sb\") pod \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.730200 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-ovsdbserver-nb\") pod \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\" (UID: \"bba461b1-74aa-45cf-bf02-2569bf3a0c2d\") " Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.840127 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bba461b1-74aa-45cf-bf02-2569bf3a0c2d" (UID: "bba461b1-74aa-45cf-bf02-2569bf3a0c2d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.845488 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-kube-api-access-jg542" (OuterVolumeSpecName: "kube-api-access-jg542") pod "bba461b1-74aa-45cf-bf02-2569bf3a0c2d" (UID: "bba461b1-74aa-45cf-bf02-2569bf3a0c2d"). InnerVolumeSpecName "kube-api-access-jg542". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.875081 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bba461b1-74aa-45cf-bf02-2569bf3a0c2d" (UID: "bba461b1-74aa-45cf-bf02-2569bf3a0c2d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.876473 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bba461b1-74aa-45cf-bf02-2569bf3a0c2d" (UID: "bba461b1-74aa-45cf-bf02-2569bf3a0c2d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.903073 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bba461b1-74aa-45cf-bf02-2569bf3a0c2d" (UID: "bba461b1-74aa-45cf-bf02-2569bf3a0c2d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.969640 4599 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.969672 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg542\" (UniqueName: \"kubernetes.io/projected/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-kube-api-access-jg542\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.969689 4599 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.969700 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:45 crc kubenswrapper[4599]: I1012 07:49:45.969708 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:45.998893 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-config" (OuterVolumeSpecName: "config") pod "bba461b1-74aa-45cf-bf02-2569bf3a0c2d" (UID: "bba461b1-74aa-45cf-bf02-2569bf3a0c2d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.072812 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bba461b1-74aa-45cf-bf02-2569bf3a0c2d-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.200796 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 12 07:49:46 crc kubenswrapper[4599]: W1012 07:49:46.202817 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6dbf57f_d6e6_417f_9c84_968e6336e3a6.slice/crio-42d3d8dba34475067a6f8ff5a8c1acad6aa7ea396d755054f8b4746bd177532a WatchSource:0}: Error finding container 42d3d8dba34475067a6f8ff5a8c1acad6aa7ea396d755054f8b4746bd177532a: Status 404 returned error can't find the container with id 42d3d8dba34475067a6f8ff5a8c1acad6aa7ea396d755054f8b4746bd177532a Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.369193 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 12 07:49:46 crc kubenswrapper[4599]: W1012 07:49:46.376893 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf8f9a6f_b14e_4241_9102_4473869410d8.slice/crio-ce6c44a2f2214b77bd3b08852683df1cfd012a0abe8006d0945c8be28d90cc2f WatchSource:0}: Error finding container ce6c44a2f2214b77bd3b08852683df1cfd012a0abe8006d0945c8be28d90cc2f: Status 404 returned error can't find the container with id ce6c44a2f2214b77bd3b08852683df1cfd012a0abe8006d0945c8be28d90cc2f Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.379835 4599 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dd96ddfff-bbpjj"] Oct 12 07:49:46 crc kubenswrapper[4599]: W1012 07:49:46.381616 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ea02d25_75be_4c25_bc35_539b4d590e10.slice/crio-eda97eeb803e27cb296dc603a5b01f23c6564ca6d2f46b5bf80f7d3ce5c4c2e8 WatchSource:0}: Error finding container eda97eeb803e27cb296dc603a5b01f23c6564ca6d2f46b5bf80f7d3ce5c4c2e8: Status 404 returned error can't find the container with id eda97eeb803e27cb296dc603a5b01f23c6564ca6d2f46b5bf80f7d3ce5c4c2e8 Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.562271 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d4cb4975-47tpw" event={"ID":"f8d2d027-f32a-4708-b7cb-5302f1def41f","Type":"ContainerStarted","Data":"34c01341c741d694dc64990b10aa5ff2e7b595a14499d6adcca6a11de7eda1b2"} Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.565576 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6dbf57f-d6e6-417f-9c84-968e6336e3a6","Type":"ContainerStarted","Data":"42d3d8dba34475067a6f8ff5a8c1acad6aa7ea396d755054f8b4746bd177532a"} Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.567019 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" event={"ID":"7ea02d25-75be-4c25-bc35-539b4d590e10","Type":"ContainerStarted","Data":"eda97eeb803e27cb296dc603a5b01f23c6564ca6d2f46b5bf80f7d3ce5c4c2e8"} Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.582382 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" event={"ID":"bba461b1-74aa-45cf-bf02-2569bf3a0c2d","Type":"ContainerDied","Data":"7ba6e6fa5ddaa1573d3c4fe7b065a0c370efb8ff22a3c97260b814d43121e513"} Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.582424 4599 scope.go:117] "RemoveContainer" 
containerID="7cef59e154cead2059286f66aa6ea852fb5213ed3e44957420b02cefe3bd0565" Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.582556 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cfb48f4f9-phx2l" Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.589254 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6d4cb4975-47tpw" podStartSLOduration=2.995301168 podStartE2EDuration="9.589242845s" podCreationTimestamp="2025-10-12 07:49:37 +0000 UTC" firstStartedPulling="2025-10-12 07:49:38.216803967 +0000 UTC m=+875.005999469" lastFinishedPulling="2025-10-12 07:49:44.810745644 +0000 UTC m=+881.599941146" observedRunningTime="2025-10-12 07:49:46.580001173 +0000 UTC m=+883.369196675" watchObservedRunningTime="2025-10-12 07:49:46.589242845 +0000 UTC m=+883.378438347" Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.597357 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-689dd94bf4-pwcdz" event={"ID":"c247e243-5ad3-4e53-a733-a11d9407c42a","Type":"ContainerStarted","Data":"b1f5c368f23058e7efdd35c773865c27d81d0c513359a4d0efffc34b859f4f71"} Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.597424 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-689dd94bf4-pwcdz" event={"ID":"c247e243-5ad3-4e53-a733-a11d9407c42a","Type":"ContainerStarted","Data":"10e6674155b7347fbe6e5ad704712f2b3b75fcd94560e92a5ee1aeeb3b8bd3ce"} Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.620817 4599 scope.go:117] "RemoveContainer" containerID="c0c7036391c45aa48ea68db4a505e984bb71baecd7cbfa2868c0c57b08b0b89a" Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.627080 4599 generic.go:334] "Generic (PLEG): container finished" podID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" containerID="4ed548665907723269ec5dee79aea041449fb9ce8af4a395c3f4fb4ed7523ae1" exitCode=0 Oct 12 07:49:46 crc 
kubenswrapper[4599]: I1012 07:49:46.627125 4599 generic.go:334] "Generic (PLEG): container finished" podID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" containerID="fa724a8bca55acb47a6b6f2cfb8f4cc5b5f33a479cb09f48c902ef947f3880a7" exitCode=2 Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.627134 4599 generic.go:334] "Generic (PLEG): container finished" podID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" containerID="7f1c45d5e3d2dfbf584a7cb0627a5bc4ccaff62de2ca235d6cde978919ccda81" exitCode=0 Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.627195 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f94bf31d-933c-4cca-998a-4c3fc9a451f4","Type":"ContainerDied","Data":"4ed548665907723269ec5dee79aea041449fb9ce8af4a395c3f4fb4ed7523ae1"} Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.627228 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f94bf31d-933c-4cca-998a-4c3fc9a451f4","Type":"ContainerDied","Data":"fa724a8bca55acb47a6b6f2cfb8f4cc5b5f33a479cb09f48c902ef947f3880a7"} Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.627240 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f94bf31d-933c-4cca-998a-4c3fc9a451f4","Type":"ContainerDied","Data":"7f1c45d5e3d2dfbf584a7cb0627a5bc4ccaff62de2ca235d6cde978919ccda81"} Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.628857 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cfb48f4f9-phx2l"] Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.632800 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"df8f9a6f-b14e-4241-9102-4473869410d8","Type":"ContainerStarted","Data":"ce6c44a2f2214b77bd3b08852683df1cfd012a0abe8006d0945c8be28d90cc2f"} Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.639255 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-58d46f7854-2972q" event={"ID":"82626116-d40d-45c2-8a6f-513cb12f6b19","Type":"ContainerStarted","Data":"456d24d88533133e7212214c29f7ebb8c70ddc727d87f62f3792d9bdd2c90775"} Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.640508 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cfb48f4f9-phx2l"] Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.644117 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-689dd94bf4-pwcdz" podStartSLOduration=6.6440963669999995 podStartE2EDuration="6.644096367s" podCreationTimestamp="2025-10-12 07:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:49:46.629613715 +0000 UTC m=+883.418809217" watchObservedRunningTime="2025-10-12 07:49:46.644096367 +0000 UTC m=+883.433291869" Oct 12 07:49:46 crc kubenswrapper[4599]: I1012 07:49:46.666697 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-58d46f7854-2972q" podStartSLOduration=3.071034964 podStartE2EDuration="9.666676951s" podCreationTimestamp="2025-10-12 07:49:37 +0000 UTC" firstStartedPulling="2025-10-12 07:49:38.204266535 +0000 UTC m=+874.993462037" lastFinishedPulling="2025-10-12 07:49:44.799908521 +0000 UTC m=+881.589104024" observedRunningTime="2025-10-12 07:49:46.656304906 +0000 UTC m=+883.445500409" watchObservedRunningTime="2025-10-12 07:49:46.666676951 +0000 UTC m=+883.455872453" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.022082 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jjn25" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.096739 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd14916d-8d1c-4c3d-9721-2d5e0e171db1-config\") pod \"dd14916d-8d1c-4c3d-9721-2d5e0e171db1\" (UID: \"dd14916d-8d1c-4c3d-9721-2d5e0e171db1\") " Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.096818 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv65f\" (UniqueName: \"kubernetes.io/projected/dd14916d-8d1c-4c3d-9721-2d5e0e171db1-kube-api-access-wv65f\") pod \"dd14916d-8d1c-4c3d-9721-2d5e0e171db1\" (UID: \"dd14916d-8d1c-4c3d-9721-2d5e0e171db1\") " Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.096920 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd14916d-8d1c-4c3d-9721-2d5e0e171db1-combined-ca-bundle\") pod \"dd14916d-8d1c-4c3d-9721-2d5e0e171db1\" (UID: \"dd14916d-8d1c-4c3d-9721-2d5e0e171db1\") " Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.100694 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd14916d-8d1c-4c3d-9721-2d5e0e171db1-kube-api-access-wv65f" (OuterVolumeSpecName: "kube-api-access-wv65f") pod "dd14916d-8d1c-4c3d-9721-2d5e0e171db1" (UID: "dd14916d-8d1c-4c3d-9721-2d5e0e171db1"). InnerVolumeSpecName "kube-api-access-wv65f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.124418 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd14916d-8d1c-4c3d-9721-2d5e0e171db1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd14916d-8d1c-4c3d-9721-2d5e0e171db1" (UID: "dd14916d-8d1c-4c3d-9721-2d5e0e171db1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.160621 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.164491 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd14916d-8d1c-4c3d-9721-2d5e0e171db1-config" (OuterVolumeSpecName: "config") pod "dd14916d-8d1c-4c3d-9721-2d5e0e171db1" (UID: "dd14916d-8d1c-4c3d-9721-2d5e0e171db1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.199478 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd14916d-8d1c-4c3d-9721-2d5e0e171db1-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.199507 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv65f\" (UniqueName: \"kubernetes.io/projected/dd14916d-8d1c-4c3d-9721-2d5e0e171db1-kube-api-access-wv65f\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.199517 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd14916d-8d1c-4c3d-9721-2d5e0e171db1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.559384 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bba461b1-74aa-45cf-bf02-2569bf3a0c2d" path="/var/lib/kubelet/pods/bba461b1-74aa-45cf-bf02-2569bf3a0c2d/volumes" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.644986 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dd96ddfff-bbpjj"] Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.696396 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"df8f9a6f-b14e-4241-9102-4473869410d8","Type":"ContainerStarted","Data":"28a2b22c580e0d331820a034f24313ab13868922510e7fd9b01c2e507e9f2972"} Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.696455 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"df8f9a6f-b14e-4241-9102-4473869410d8","Type":"ContainerStarted","Data":"b116cb821939a9c933101c37932183909c5cfd195bf545978df7572d26a519fe"} Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.696675 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="df8f9a6f-b14e-4241-9102-4473869410d8" containerName="cinder-api-log" containerID="cri-o://b116cb821939a9c933101c37932183909c5cfd195bf545978df7572d26a519fe" gracePeriod=30 Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.696879 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.697310 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="df8f9a6f-b14e-4241-9102-4473869410d8" containerName="cinder-api" containerID="cri-o://28a2b22c580e0d331820a034f24313ab13868922510e7fd9b01c2e507e9f2972" gracePeriod=30 Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.717759 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56c55fd88c-fnjlg"] Oct 12 07:49:47 crc kubenswrapper[4599]: E1012 07:49:47.718655 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bba461b1-74aa-45cf-bf02-2569bf3a0c2d" containerName="init" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.718678 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="bba461b1-74aa-45cf-bf02-2569bf3a0c2d" containerName="init" Oct 12 07:49:47 crc kubenswrapper[4599]: E1012 07:49:47.718726 4599 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bba461b1-74aa-45cf-bf02-2569bf3a0c2d" containerName="dnsmasq-dns" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.718733 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="bba461b1-74aa-45cf-bf02-2569bf3a0c2d" containerName="dnsmasq-dns" Oct 12 07:49:47 crc kubenswrapper[4599]: E1012 07:49:47.718773 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd14916d-8d1c-4c3d-9721-2d5e0e171db1" containerName="neutron-db-sync" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.718780 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd14916d-8d1c-4c3d-9721-2d5e0e171db1" containerName="neutron-db-sync" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.719094 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd14916d-8d1c-4c3d-9721-2d5e0e171db1" containerName="neutron-db-sync" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.719121 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="bba461b1-74aa-45cf-bf02-2569bf3a0c2d" containerName="dnsmasq-dns" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.719385 4599 generic.go:334] "Generic (PLEG): container finished" podID="7ea02d25-75be-4c25-bc35-539b4d590e10" containerID="63718947180acc94069912a66deb1f5589cb279521cf6d6036133cb90c25f4be" exitCode=0 Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.730238 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" event={"ID":"7ea02d25-75be-4c25-bc35-539b4d590e10","Type":"ContainerDied","Data":"63718947180acc94069912a66deb1f5589cb279521cf6d6036133cb90c25f4be"} Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.730380 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.762636 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jjn25" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.763480 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jjn25" event={"ID":"dd14916d-8d1c-4c3d-9721-2d5e0e171db1","Type":"ContainerDied","Data":"74b3220c5e595ad3558f71b4c9f98616b8bdc4a1af37f614112a7c711a5d7464"} Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.763530 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74b3220c5e595ad3558f71b4c9f98616b8bdc4a1af37f614112a7c711a5d7464" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.763994 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.764299 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.787390 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56c55fd88c-fnjlg"] Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.796539 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.796513925 podStartE2EDuration="2.796513925s" podCreationTimestamp="2025-10-12 07:49:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:49:47.727862222 +0000 UTC m=+884.517057724" watchObservedRunningTime="2025-10-12 07:49:47.796513925 +0000 UTC m=+884.585709426" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.831583 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-586f7cb8d6-xq2lk"] Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.833367 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-586f7cb8d6-xq2lk" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.831622 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-dns-swift-storage-0\") pod \"dnsmasq-dns-56c55fd88c-fnjlg\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.834824 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vfz5\" (UniqueName: \"kubernetes.io/projected/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-kube-api-access-5vfz5\") pod \"dnsmasq-dns-56c55fd88c-fnjlg\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.834854 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-ovsdbserver-sb\") pod \"dnsmasq-dns-56c55fd88c-fnjlg\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.834901 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-ovsdbserver-nb\") pod \"dnsmasq-dns-56c55fd88c-fnjlg\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.835072 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-dns-svc\") pod 
\"dnsmasq-dns-56c55fd88c-fnjlg\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.835118 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-config\") pod \"dnsmasq-dns-56c55fd88c-fnjlg\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.838251 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.838553 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wlqpk" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.839571 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.839690 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.843219 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-586f7cb8d6-xq2lk"] Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.937856 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-ovndb-tls-certs\") pod \"neutron-586f7cb8d6-xq2lk\" (UID: \"fc5e8701-df4e-459c-85ae-78db217fcea0\") " pod="openstack/neutron-586f7cb8d6-xq2lk" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.938870 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-dns-swift-storage-0\") pod \"dnsmasq-dns-56c55fd88c-fnjlg\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.938923 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-dns-swift-storage-0\") pod \"dnsmasq-dns-56c55fd88c-fnjlg\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.938974 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-config\") pod \"neutron-586f7cb8d6-xq2lk\" (UID: \"fc5e8701-df4e-459c-85ae-78db217fcea0\") " pod="openstack/neutron-586f7cb8d6-xq2lk" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.938993 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-488pl\" (UniqueName: \"kubernetes.io/projected/fc5e8701-df4e-459c-85ae-78db217fcea0-kube-api-access-488pl\") pod \"neutron-586f7cb8d6-xq2lk\" (UID: \"fc5e8701-df4e-459c-85ae-78db217fcea0\") " pod="openstack/neutron-586f7cb8d6-xq2lk" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.939017 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vfz5\" (UniqueName: \"kubernetes.io/projected/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-kube-api-access-5vfz5\") pod \"dnsmasq-dns-56c55fd88c-fnjlg\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.939037 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-ovsdbserver-sb\") pod \"dnsmasq-dns-56c55fd88c-fnjlg\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.939054 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-ovsdbserver-nb\") pod \"dnsmasq-dns-56c55fd88c-fnjlg\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.939129 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-dns-svc\") pod \"dnsmasq-dns-56c55fd88c-fnjlg\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.939147 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-config\") pod \"dnsmasq-dns-56c55fd88c-fnjlg\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.939196 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-httpd-config\") pod \"neutron-586f7cb8d6-xq2lk\" (UID: \"fc5e8701-df4e-459c-85ae-78db217fcea0\") " pod="openstack/neutron-586f7cb8d6-xq2lk" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.939213 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-combined-ca-bundle\") pod \"neutron-586f7cb8d6-xq2lk\" (UID: \"fc5e8701-df4e-459c-85ae-78db217fcea0\") " pod="openstack/neutron-586f7cb8d6-xq2lk" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.940049 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-ovsdbserver-sb\") pod \"dnsmasq-dns-56c55fd88c-fnjlg\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.940668 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-ovsdbserver-nb\") pod \"dnsmasq-dns-56c55fd88c-fnjlg\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.941411 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-dns-svc\") pod \"dnsmasq-dns-56c55fd88c-fnjlg\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.943228 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-config\") pod \"dnsmasq-dns-56c55fd88c-fnjlg\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:47 crc kubenswrapper[4599]: I1012 07:49:47.953743 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vfz5\" (UniqueName: \"kubernetes.io/projected/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-kube-api-access-5vfz5\") pod \"dnsmasq-dns-56c55fd88c-fnjlg\" 
(UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:48 crc kubenswrapper[4599]: I1012 07:49:48.040027 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-httpd-config\") pod \"neutron-586f7cb8d6-xq2lk\" (UID: \"fc5e8701-df4e-459c-85ae-78db217fcea0\") " pod="openstack/neutron-586f7cb8d6-xq2lk" Oct 12 07:49:48 crc kubenswrapper[4599]: I1012 07:49:48.040071 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-combined-ca-bundle\") pod \"neutron-586f7cb8d6-xq2lk\" (UID: \"fc5e8701-df4e-459c-85ae-78db217fcea0\") " pod="openstack/neutron-586f7cb8d6-xq2lk" Oct 12 07:49:48 crc kubenswrapper[4599]: I1012 07:49:48.040115 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-ovndb-tls-certs\") pod \"neutron-586f7cb8d6-xq2lk\" (UID: \"fc5e8701-df4e-459c-85ae-78db217fcea0\") " pod="openstack/neutron-586f7cb8d6-xq2lk" Oct 12 07:49:48 crc kubenswrapper[4599]: I1012 07:49:48.040173 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-config\") pod \"neutron-586f7cb8d6-xq2lk\" (UID: \"fc5e8701-df4e-459c-85ae-78db217fcea0\") " pod="openstack/neutron-586f7cb8d6-xq2lk" Oct 12 07:49:48 crc kubenswrapper[4599]: I1012 07:49:48.040200 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-488pl\" (UniqueName: \"kubernetes.io/projected/fc5e8701-df4e-459c-85ae-78db217fcea0-kube-api-access-488pl\") pod \"neutron-586f7cb8d6-xq2lk\" (UID: \"fc5e8701-df4e-459c-85ae-78db217fcea0\") " pod="openstack/neutron-586f7cb8d6-xq2lk" Oct 12 
07:49:48 crc kubenswrapper[4599]: I1012 07:49:48.043877 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-combined-ca-bundle\") pod \"neutron-586f7cb8d6-xq2lk\" (UID: \"fc5e8701-df4e-459c-85ae-78db217fcea0\") " pod="openstack/neutron-586f7cb8d6-xq2lk" Oct 12 07:49:48 crc kubenswrapper[4599]: I1012 07:49:48.044985 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-httpd-config\") pod \"neutron-586f7cb8d6-xq2lk\" (UID: \"fc5e8701-df4e-459c-85ae-78db217fcea0\") " pod="openstack/neutron-586f7cb8d6-xq2lk" Oct 12 07:49:48 crc kubenswrapper[4599]: I1012 07:49:48.045020 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-ovndb-tls-certs\") pod \"neutron-586f7cb8d6-xq2lk\" (UID: \"fc5e8701-df4e-459c-85ae-78db217fcea0\") " pod="openstack/neutron-586f7cb8d6-xq2lk" Oct 12 07:49:48 crc kubenswrapper[4599]: I1012 07:49:48.047488 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-config\") pod \"neutron-586f7cb8d6-xq2lk\" (UID: \"fc5e8701-df4e-459c-85ae-78db217fcea0\") " pod="openstack/neutron-586f7cb8d6-xq2lk" Oct 12 07:49:48 crc kubenswrapper[4599]: I1012 07:49:48.055868 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-488pl\" (UniqueName: \"kubernetes.io/projected/fc5e8701-df4e-459c-85ae-78db217fcea0-kube-api-access-488pl\") pod \"neutron-586f7cb8d6-xq2lk\" (UID: \"fc5e8701-df4e-459c-85ae-78db217fcea0\") " pod="openstack/neutron-586f7cb8d6-xq2lk" Oct 12 07:49:48 crc kubenswrapper[4599]: I1012 07:49:48.074088 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:48 crc kubenswrapper[4599]: E1012 07:49:48.117831 4599 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 12 07:49:48 crc kubenswrapper[4599]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/7ea02d25-75be-4c25-bc35-539b4d590e10/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 12 07:49:48 crc kubenswrapper[4599]: > podSandboxID="eda97eeb803e27cb296dc603a5b01f23c6564ca6d2f46b5bf80f7d3ce5c4c2e8" Oct 12 07:49:48 crc kubenswrapper[4599]: E1012 07:49:48.118234 4599 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 12 07:49:48 crc kubenswrapper[4599]: container &Container{Name:dnsmasq-dns,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:92672cd85fd36317d65faa0525acf849,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n675h87h7dh67bh64bh7bh5fdhddh5ddhf7h66bh66fh56hc5hbdh689h58dh56ch596h56ch657h557h6dh5dbh696h68fhfdh578hf4h5h669h669q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vrzjx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5dd96ddfff-bbpjj_openstack(7ea02d25-75be-4c25-bc35-539b4d590e10): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/7ea02d25-75be-4c25-bc35-539b4d590e10/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 12 07:49:48 crc kubenswrapper[4599]: > logger="UnhandledError" Oct 12 07:49:48 crc kubenswrapper[4599]: E1012 07:49:48.119500 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/7ea02d25-75be-4c25-bc35-539b4d590e10/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" podUID="7ea02d25-75be-4c25-bc35-539b4d590e10" Oct 12 07:49:48 crc kubenswrapper[4599]: I1012 07:49:48.200944 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-586f7cb8d6-xq2lk" Oct 12 07:49:48 crc kubenswrapper[4599]: I1012 07:49:48.515239 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56c55fd88c-fnjlg"] Oct 12 07:49:48 crc kubenswrapper[4599]: W1012 07:49:48.528070 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebd046c8_0af0_4d96_99a4_ce5bb0dec770.slice/crio-308577e1ca57abdf929523a1ea568f4bb0cc48d143e57aabc6dd6726a949273f WatchSource:0}: Error finding container 308577e1ca57abdf929523a1ea568f4bb0cc48d143e57aabc6dd6726a949273f: Status 404 returned error can't find the container with id 308577e1ca57abdf929523a1ea568f4bb0cc48d143e57aabc6dd6726a949273f Oct 12 07:49:48 crc kubenswrapper[4599]: I1012 07:49:48.788443 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6dbf57f-d6e6-417f-9c84-968e6336e3a6","Type":"ContainerStarted","Data":"dc8ce30ad7abfcaccc058bec4b88434be1c58dd95b2012cfd0da33292d6d2a8f"} Oct 12 07:49:48 crc kubenswrapper[4599]: I1012 07:49:48.791555 4599 generic.go:334] "Generic (PLEG): container finished" podID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" containerID="036a903d34b4436b12b7ccac14fb27518208774daa9e31d4ce65967db6523fc1" exitCode=0 Oct 12 07:49:48 crc kubenswrapper[4599]: I1012 07:49:48.791605 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f94bf31d-933c-4cca-998a-4c3fc9a451f4","Type":"ContainerDied","Data":"036a903d34b4436b12b7ccac14fb27518208774daa9e31d4ce65967db6523fc1"} Oct 12 07:49:48 crc kubenswrapper[4599]: I1012 07:49:48.813604 4599 generic.go:334] "Generic (PLEG): container finished" podID="df8f9a6f-b14e-4241-9102-4473869410d8" containerID="b116cb821939a9c933101c37932183909c5cfd195bf545978df7572d26a519fe" exitCode=143 Oct 12 07:49:48 crc kubenswrapper[4599]: I1012 07:49:48.813672 4599 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/cinder-api-0" event={"ID":"df8f9a6f-b14e-4241-9102-4473869410d8","Type":"ContainerDied","Data":"b116cb821939a9c933101c37932183909c5cfd195bf545978df7572d26a519fe"} Oct 12 07:49:48 crc kubenswrapper[4599]: I1012 07:49:48.818711 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" event={"ID":"ebd046c8-0af0-4d96-99a4-ce5bb0dec770","Type":"ContainerStarted","Data":"308577e1ca57abdf929523a1ea568f4bb0cc48d143e57aabc6dd6726a949273f"} Oct 12 07:49:48 crc kubenswrapper[4599]: I1012 07:49:48.978408 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-586f7cb8d6-xq2lk"] Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.121458 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.141413 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.176054 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-config-data\") pod \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.176143 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-sg-core-conf-yaml\") pod \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.176222 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-combined-ca-bundle\") pod 
\"f94bf31d-933c-4cca-998a-4c3fc9a451f4\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.176357 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-dns-swift-storage-0\") pod \"7ea02d25-75be-4c25-bc35-539b4d590e10\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.176387 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f94bf31d-933c-4cca-998a-4c3fc9a451f4-run-httpd\") pod \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.176431 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-scripts\") pod \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.176759 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f94bf31d-933c-4cca-998a-4c3fc9a451f4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f94bf31d-933c-4cca-998a-4c3fc9a451f4" (UID: "f94bf31d-933c-4cca-998a-4c3fc9a451f4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.176454 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-dns-svc\") pod \"7ea02d25-75be-4c25-bc35-539b4d590e10\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.177130 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f94bf31d-933c-4cca-998a-4c3fc9a451f4-log-httpd\") pod \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.177180 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-ovsdbserver-sb\") pod \"7ea02d25-75be-4c25-bc35-539b4d590e10\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.177224 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hlxx\" (UniqueName: \"kubernetes.io/projected/f94bf31d-933c-4cca-998a-4c3fc9a451f4-kube-api-access-5hlxx\") pod \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\" (UID: \"f94bf31d-933c-4cca-998a-4c3fc9a451f4\") " Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.177279 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrzjx\" (UniqueName: \"kubernetes.io/projected/7ea02d25-75be-4c25-bc35-539b4d590e10-kube-api-access-vrzjx\") pod \"7ea02d25-75be-4c25-bc35-539b4d590e10\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.177359 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-ovsdbserver-nb\") pod \"7ea02d25-75be-4c25-bc35-539b4d590e10\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.177425 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-config\") pod \"7ea02d25-75be-4c25-bc35-539b4d590e10\" (UID: \"7ea02d25-75be-4c25-bc35-539b4d590e10\") " Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.177775 4599 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f94bf31d-933c-4cca-998a-4c3fc9a451f4-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.178376 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f94bf31d-933c-4cca-998a-4c3fc9a451f4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f94bf31d-933c-4cca-998a-4c3fc9a451f4" (UID: "f94bf31d-933c-4cca-998a-4c3fc9a451f4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.180786 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-scripts" (OuterVolumeSpecName: "scripts") pod "f94bf31d-933c-4cca-998a-4c3fc9a451f4" (UID: "f94bf31d-933c-4cca-998a-4c3fc9a451f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.187498 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f94bf31d-933c-4cca-998a-4c3fc9a451f4-kube-api-access-5hlxx" (OuterVolumeSpecName: "kube-api-access-5hlxx") pod "f94bf31d-933c-4cca-998a-4c3fc9a451f4" (UID: "f94bf31d-933c-4cca-998a-4c3fc9a451f4"). 
InnerVolumeSpecName "kube-api-access-5hlxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.188907 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ea02d25-75be-4c25-bc35-539b4d590e10-kube-api-access-vrzjx" (OuterVolumeSpecName: "kube-api-access-vrzjx") pod "7ea02d25-75be-4c25-bc35-539b4d590e10" (UID: "7ea02d25-75be-4c25-bc35-539b4d590e10"). InnerVolumeSpecName "kube-api-access-vrzjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.281249 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f94bf31d-933c-4cca-998a-4c3fc9a451f4" (UID: "f94bf31d-933c-4cca-998a-4c3fc9a451f4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.282530 4599 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.282599 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.282612 4599 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f94bf31d-933c-4cca-998a-4c3fc9a451f4-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.282623 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hlxx\" (UniqueName: 
\"kubernetes.io/projected/f94bf31d-933c-4cca-998a-4c3fc9a451f4-kube-api-access-5hlxx\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.282637 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrzjx\" (UniqueName: \"kubernetes.io/projected/7ea02d25-75be-4c25-bc35-539b4d590e10-kube-api-access-vrzjx\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.296935 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7ea02d25-75be-4c25-bc35-539b4d590e10" (UID: "7ea02d25-75be-4c25-bc35-539b4d590e10"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.301392 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7ea02d25-75be-4c25-bc35-539b4d590e10" (UID: "7ea02d25-75be-4c25-bc35-539b4d590e10"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.310133 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7ea02d25-75be-4c25-bc35-539b4d590e10" (UID: "7ea02d25-75be-4c25-bc35-539b4d590e10"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.312431 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7ea02d25-75be-4c25-bc35-539b4d590e10" (UID: "7ea02d25-75be-4c25-bc35-539b4d590e10"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.312710 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-config" (OuterVolumeSpecName: "config") pod "7ea02d25-75be-4c25-bc35-539b4d590e10" (UID: "7ea02d25-75be-4c25-bc35-539b4d590e10"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.328161 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f94bf31d-933c-4cca-998a-4c3fc9a451f4" (UID: "f94bf31d-933c-4cca-998a-4c3fc9a451f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.361205 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-config-data" (OuterVolumeSpecName: "config-data") pod "f94bf31d-933c-4cca-998a-4c3fc9a451f4" (UID: "f94bf31d-933c-4cca-998a-4c3fc9a451f4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.382832 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.384423 4599 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.384514 4599 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.384574 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.384625 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.384684 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ea02d25-75be-4c25-bc35-539b4d590e10-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.384733 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.384792 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f94bf31d-933c-4cca-998a-4c3fc9a451f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.557850 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.829345 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f94bf31d-933c-4cca-998a-4c3fc9a451f4","Type":"ContainerDied","Data":"289d224b3977712361e5b93f8c1a0b7e85fe80edaf86f173cd6b5e2b00d8de30"} Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.829655 4599 scope.go:117] "RemoveContainer" containerID="4ed548665907723269ec5dee79aea041449fb9ce8af4a395c3f4fb4ed7523ae1" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.829407 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.830624 4599 generic.go:334] "Generic (PLEG): container finished" podID="ebd046c8-0af0-4d96-99a4-ce5bb0dec770" containerID="862c749414fb2e59c5221d01459de7394ea1c056bcdadc2badceb43204cc4214" exitCode=0 Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.830748 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" event={"ID":"ebd046c8-0af0-4d96-99a4-ce5bb0dec770","Type":"ContainerDied","Data":"862c749414fb2e59c5221d01459de7394ea1c056bcdadc2badceb43204cc4214"} Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.835658 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-586f7cb8d6-xq2lk" event={"ID":"fc5e8701-df4e-459c-85ae-78db217fcea0","Type":"ContainerStarted","Data":"869243432b03a8368e341b8063082bb68d7d1901d66174e6ffed62187df2b87e"} Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.835707 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-586f7cb8d6-xq2lk" 
event={"ID":"fc5e8701-df4e-459c-85ae-78db217fcea0","Type":"ContainerStarted","Data":"94fce9a0f63f7cf8b138f590f6da338cb28eb8bb2a21c68e841b5c625a81f6c4"} Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.835724 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-586f7cb8d6-xq2lk" event={"ID":"fc5e8701-df4e-459c-85ae-78db217fcea0","Type":"ContainerStarted","Data":"b42401f8cf3b4e6c273c057fa4ba267bdd331d8d4876919e80c12c05574c2cd5"} Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.836171 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-586f7cb8d6-xq2lk" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.844084 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6dbf57f-d6e6-417f-9c84-968e6336e3a6","Type":"ContainerStarted","Data":"0f6bb05a52897aac97d0fa11262581ea437fe7ef13f2f0f39c5270a9ffcfa14a"} Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.846664 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.848086 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dd96ddfff-bbpjj" event={"ID":"7ea02d25-75be-4c25-bc35-539b4d590e10","Type":"ContainerDied","Data":"eda97eeb803e27cb296dc603a5b01f23c6564ca6d2f46b5bf80f7d3ce5c4c2e8"} Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.887050 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-586f7cb8d6-xq2lk" podStartSLOduration=2.887016905 podStartE2EDuration="2.887016905s" podCreationTimestamp="2025-10-12 07:49:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:49:49.879407233 +0000 UTC m=+886.668602735" watchObservedRunningTime="2025-10-12 07:49:49.887016905 +0000 UTC m=+886.676212407" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.917169 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.822243673 podStartE2EDuration="4.917139494s" podCreationTimestamp="2025-10-12 07:49:45 +0000 UTC" firstStartedPulling="2025-10-12 07:49:46.204824615 +0000 UTC m=+882.994020117" lastFinishedPulling="2025-10-12 07:49:47.299720437 +0000 UTC m=+884.088915938" observedRunningTime="2025-10-12 07:49:49.913363988 +0000 UTC m=+886.702559490" watchObservedRunningTime="2025-10-12 07:49:49.917139494 +0000 UTC m=+886.706334995" Oct 12 07:49:49 crc kubenswrapper[4599]: I1012 07:49:49.968755 4599 scope.go:117] "RemoveContainer" containerID="fa724a8bca55acb47a6b6f2cfb8f4cc5b5f33a479cb09f48c902ef947f3880a7" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.052542 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dd96ddfff-bbpjj"] Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.054605 4599 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dd96ddfff-bbpjj"] Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.064446 4599 scope.go:117] "RemoveContainer" containerID="036a903d34b4436b12b7ccac14fb27518208774daa9e31d4ce65967db6523fc1" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.075416 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.092239 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.097167 4599 scope.go:117] "RemoveContainer" containerID="7f1c45d5e3d2dfbf584a7cb0627a5bc4ccaff62de2ca235d6cde978919ccda81" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.101989 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:49:50 crc kubenswrapper[4599]: E1012 07:49:50.102672 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" containerName="ceilometer-notification-agent" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.102694 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" containerName="ceilometer-notification-agent" Oct 12 07:49:50 crc kubenswrapper[4599]: E1012 07:49:50.102718 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" containerName="proxy-httpd" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.102724 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" containerName="proxy-httpd" Oct 12 07:49:50 crc kubenswrapper[4599]: E1012 07:49:50.102766 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ea02d25-75be-4c25-bc35-539b4d590e10" containerName="init" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.102773 4599 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7ea02d25-75be-4c25-bc35-539b4d590e10" containerName="init" Oct 12 07:49:50 crc kubenswrapper[4599]: E1012 07:49:50.102793 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" containerName="sg-core" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.102798 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" containerName="sg-core" Oct 12 07:49:50 crc kubenswrapper[4599]: E1012 07:49:50.102829 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" containerName="ceilometer-central-agent" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.102835 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" containerName="ceilometer-central-agent" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.103041 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" containerName="sg-core" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.103069 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ea02d25-75be-4c25-bc35-539b4d590e10" containerName="init" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.103089 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" containerName="ceilometer-notification-agent" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.103098 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" containerName="proxy-httpd" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.103119 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" containerName="ceilometer-central-agent" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.105099 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.109884 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.110047 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.110313 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.138620 4599 scope.go:117] "RemoveContainer" containerID="63718947180acc94069912a66deb1f5589cb279521cf6d6036133cb90c25f4be" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.260801 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48bhv\" (UniqueName: \"kubernetes.io/projected/412835ea-6470-43ff-896b-98da92bb7393-kube-api-access-48bhv\") pod \"ceilometer-0\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.260854 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-config-data\") pod \"ceilometer-0\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.260980 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/412835ea-6470-43ff-896b-98da92bb7393-run-httpd\") pod \"ceilometer-0\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.261096 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.261137 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/412835ea-6470-43ff-896b-98da92bb7393-log-httpd\") pod \"ceilometer-0\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.261161 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-scripts\") pod \"ceilometer-0\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.261290 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.363894 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48bhv\" (UniqueName: \"kubernetes.io/projected/412835ea-6470-43ff-896b-98da92bb7393-kube-api-access-48bhv\") pod \"ceilometer-0\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.364301 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-config-data\") pod \"ceilometer-0\" (UID: 
\"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.364913 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/412835ea-6470-43ff-896b-98da92bb7393-run-httpd\") pod \"ceilometer-0\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.364982 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.365007 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/412835ea-6470-43ff-896b-98da92bb7393-log-httpd\") pod \"ceilometer-0\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.365025 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-scripts\") pod \"ceilometer-0\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.365099 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.365428 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/412835ea-6470-43ff-896b-98da92bb7393-run-httpd\") pod \"ceilometer-0\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.365850 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/412835ea-6470-43ff-896b-98da92bb7393-log-httpd\") pod \"ceilometer-0\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.368020 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-config-data\") pod \"ceilometer-0\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.368039 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.368961 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-scripts\") pod \"ceilometer-0\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.369071 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.382505 4599 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-48bhv\" (UniqueName: \"kubernetes.io/projected/412835ea-6470-43ff-896b-98da92bb7393-kube-api-access-48bhv\") pod \"ceilometer-0\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.389952 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.429827 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.846951 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.856542 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" event={"ID":"ebd046c8-0af0-4d96-99a4-ce5bb0dec770","Type":"ContainerStarted","Data":"658f7996d8872d03e757d2c4af95c3bc377dfbed1c8efa8a57c96049fa3952f0"} Oct 12 07:49:50 crc kubenswrapper[4599]: I1012 07:49:50.857500 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.009998 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" podStartSLOduration=4.009975506 podStartE2EDuration="4.009975506s" podCreationTimestamp="2025-10-12 07:49:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:49:50.876630943 +0000 UTC m=+887.665826446" watchObservedRunningTime="2025-10-12 07:49:51.009975506 +0000 UTC m=+887.799171008" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.011883 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6dcf447b8f-d5qql"] Oct 12 07:49:51 crc 
kubenswrapper[4599]: I1012 07:49:51.013268 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.014938 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.015289 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.021593 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6dcf447b8f-d5qql"] Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.180265 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f21bc7c-6371-45de-9766-ce9ad07df644-public-tls-certs\") pod \"neutron-6dcf447b8f-d5qql\" (UID: \"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.180365 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f21bc7c-6371-45de-9766-ce9ad07df644-config\") pod \"neutron-6dcf447b8f-d5qql\" (UID: \"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.180404 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1f21bc7c-6371-45de-9766-ce9ad07df644-httpd-config\") pod \"neutron-6dcf447b8f-d5qql\" (UID: \"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.180435 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f21bc7c-6371-45de-9766-ce9ad07df644-ovndb-tls-certs\") pod \"neutron-6dcf447b8f-d5qql\" (UID: \"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.180454 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f21bc7c-6371-45de-9766-ce9ad07df644-combined-ca-bundle\") pod \"neutron-6dcf447b8f-d5qql\" (UID: \"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.180474 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f21bc7c-6371-45de-9766-ce9ad07df644-internal-tls-certs\") pod \"neutron-6dcf447b8f-d5qql\" (UID: \"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.180493 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfmxx\" (UniqueName: \"kubernetes.io/projected/1f21bc7c-6371-45de-9766-ce9ad07df644-kube-api-access-cfmxx\") pod \"neutron-6dcf447b8f-d5qql\" (UID: \"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.282629 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f21bc7c-6371-45de-9766-ce9ad07df644-public-tls-certs\") pod \"neutron-6dcf447b8f-d5qql\" (UID: \"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.282942 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/1f21bc7c-6371-45de-9766-ce9ad07df644-config\") pod \"neutron-6dcf447b8f-d5qql\" (UID: \"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.283063 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1f21bc7c-6371-45de-9766-ce9ad07df644-httpd-config\") pod \"neutron-6dcf447b8f-d5qql\" (UID: \"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.283188 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f21bc7c-6371-45de-9766-ce9ad07df644-ovndb-tls-certs\") pod \"neutron-6dcf447b8f-d5qql\" (UID: \"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.283262 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f21bc7c-6371-45de-9766-ce9ad07df644-combined-ca-bundle\") pod \"neutron-6dcf447b8f-d5qql\" (UID: \"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.283362 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f21bc7c-6371-45de-9766-ce9ad07df644-internal-tls-certs\") pod \"neutron-6dcf447b8f-d5qql\" (UID: \"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.283449 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfmxx\" (UniqueName: 
\"kubernetes.io/projected/1f21bc7c-6371-45de-9766-ce9ad07df644-kube-api-access-cfmxx\") pod \"neutron-6dcf447b8f-d5qql\" (UID: \"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.290004 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f21bc7c-6371-45de-9766-ce9ad07df644-public-tls-certs\") pod \"neutron-6dcf447b8f-d5qql\" (UID: \"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.292358 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f21bc7c-6371-45de-9766-ce9ad07df644-combined-ca-bundle\") pod \"neutron-6dcf447b8f-d5qql\" (UID: \"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.296322 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f21bc7c-6371-45de-9766-ce9ad07df644-ovndb-tls-certs\") pod \"neutron-6dcf447b8f-d5qql\" (UID: \"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.297475 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f21bc7c-6371-45de-9766-ce9ad07df644-internal-tls-certs\") pod \"neutron-6dcf447b8f-d5qql\" (UID: \"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.300589 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1f21bc7c-6371-45de-9766-ce9ad07df644-httpd-config\") pod \"neutron-6dcf447b8f-d5qql\" (UID: 
\"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.311902 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f21bc7c-6371-45de-9766-ce9ad07df644-config\") pod \"neutron-6dcf447b8f-d5qql\" (UID: \"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.312616 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfmxx\" (UniqueName: \"kubernetes.io/projected/1f21bc7c-6371-45de-9766-ce9ad07df644-kube-api-access-cfmxx\") pod \"neutron-6dcf447b8f-d5qql\" (UID: \"1f21bc7c-6371-45de-9766-ce9ad07df644\") " pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.330181 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.566087 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ea02d25-75be-4c25-bc35-539b4d590e10" path="/var/lib/kubelet/pods/7ea02d25-75be-4c25-bc35-539b4d590e10/volumes" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.567267 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f94bf31d-933c-4cca-998a-4c3fc9a451f4" path="/var/lib/kubelet/pods/f94bf31d-933c-4cca-998a-4c3fc9a451f4/volumes" Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.825198 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6dcf447b8f-d5qql"] Oct 12 07:49:51 crc kubenswrapper[4599]: W1012 07:49:51.829731 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f21bc7c_6371_45de_9766_ce9ad07df644.slice/crio-22536d2dd14544f79feccc48af23e39da6fdb78954f8bcbe71441011f979470f WatchSource:0}: Error 
finding container 22536d2dd14544f79feccc48af23e39da6fdb78954f8bcbe71441011f979470f: Status 404 returned error can't find the container with id 22536d2dd14544f79feccc48af23e39da6fdb78954f8bcbe71441011f979470f Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.872727 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"412835ea-6470-43ff-896b-98da92bb7393","Type":"ContainerStarted","Data":"9f1e3309f87e6bc3c1a86687b03d8d42468c0a05e56b65a1d42faa3f80cb0de9"} Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.872840 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"412835ea-6470-43ff-896b-98da92bb7393","Type":"ContainerStarted","Data":"020e24f9fcc28864b5d078659cbc3da7fa453b8a26a4b9e6ea96dd95cb9212c1"} Oct 12 07:49:51 crc kubenswrapper[4599]: I1012 07:49:51.873535 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dcf447b8f-d5qql" event={"ID":"1f21bc7c-6371-45de-9766-ce9ad07df644","Type":"ContainerStarted","Data":"22536d2dd14544f79feccc48af23e39da6fdb78954f8bcbe71441011f979470f"} Oct 12 07:49:52 crc kubenswrapper[4599]: I1012 07:49:52.517060 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:52 crc kubenswrapper[4599]: I1012 07:49:52.587652 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-689dd94bf4-pwcdz" Oct 12 07:49:52 crc kubenswrapper[4599]: I1012 07:49:52.654314 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-bdcbd85c8-pvjzc"] Oct 12 07:49:52 crc kubenswrapper[4599]: I1012 07:49:52.654589 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-bdcbd85c8-pvjzc" podUID="d5b5eea9-05b1-417c-a181-bb58db532acc" containerName="barbican-api-log" containerID="cri-o://0aeb413e3dd8be9b9cf5e4df72e9eb5dd9985af11d03e78f933583c58116d3ee" 
gracePeriod=30 Oct 12 07:49:52 crc kubenswrapper[4599]: I1012 07:49:52.655093 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-bdcbd85c8-pvjzc" podUID="d5b5eea9-05b1-417c-a181-bb58db532acc" containerName="barbican-api" containerID="cri-o://31fbe1e390315b7164049d2e778c5dc4012cec9e0dd2e8e2f9c81250418829b7" gracePeriod=30 Oct 12 07:49:52 crc kubenswrapper[4599]: I1012 07:49:52.886877 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dcf447b8f-d5qql" event={"ID":"1f21bc7c-6371-45de-9766-ce9ad07df644","Type":"ContainerStarted","Data":"071e1a2f26b3f3eec181c32dfb12189a58075f36154e8ef5e290c4cac975173a"} Oct 12 07:49:52 crc kubenswrapper[4599]: I1012 07:49:52.887298 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:49:52 crc kubenswrapper[4599]: I1012 07:49:52.887316 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dcf447b8f-d5qql" event={"ID":"1f21bc7c-6371-45de-9766-ce9ad07df644","Type":"ContainerStarted","Data":"c63cdaa8b738cbf5fd005a9aa89528cfdbfa6fc5857338389f2631b8d7bf30e9"} Oct 12 07:49:52 crc kubenswrapper[4599]: I1012 07:49:52.888389 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"412835ea-6470-43ff-896b-98da92bb7393","Type":"ContainerStarted","Data":"0b52b9ddb35fd1b5fde0183f396cf3cc0ff2109d3a809f577cd974374526c102"} Oct 12 07:49:52 crc kubenswrapper[4599]: I1012 07:49:52.890015 4599 generic.go:334] "Generic (PLEG): container finished" podID="d5b5eea9-05b1-417c-a181-bb58db532acc" containerID="0aeb413e3dd8be9b9cf5e4df72e9eb5dd9985af11d03e78f933583c58116d3ee" exitCode=143 Oct 12 07:49:52 crc kubenswrapper[4599]: I1012 07:49:52.890077 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bdcbd85c8-pvjzc" 
event={"ID":"d5b5eea9-05b1-417c-a181-bb58db532acc","Type":"ContainerDied","Data":"0aeb413e3dd8be9b9cf5e4df72e9eb5dd9985af11d03e78f933583c58116d3ee"} Oct 12 07:49:52 crc kubenswrapper[4599]: I1012 07:49:52.909956 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6dcf447b8f-d5qql" podStartSLOduration=2.909941858 podStartE2EDuration="2.909941858s" podCreationTimestamp="2025-10-12 07:49:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:49:52.902388383 +0000 UTC m=+889.691583885" watchObservedRunningTime="2025-10-12 07:49:52.909941858 +0000 UTC m=+889.699137360" Oct 12 07:49:53 crc kubenswrapper[4599]: I1012 07:49:53.907905 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"412835ea-6470-43ff-896b-98da92bb7393","Type":"ContainerStarted","Data":"6dc982e6d6678280a87bbc087bc07e454ed6ac050468fb389f84f383b998624b"} Oct 12 07:49:54 crc kubenswrapper[4599]: I1012 07:49:54.917486 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"412835ea-6470-43ff-896b-98da92bb7393","Type":"ContainerStarted","Data":"7c4482729785f9de8488a78c1c54e0e328e2b6a2cd8a10effc46ef27c500410d"} Oct 12 07:49:54 crc kubenswrapper[4599]: I1012 07:49:54.917861 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 12 07:49:54 crc kubenswrapper[4599]: I1012 07:49:54.937978 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.485631464 podStartE2EDuration="4.937950166s" podCreationTimestamp="2025-10-12 07:49:50 +0000 UTC" firstStartedPulling="2025-10-12 07:49:50.860601322 +0000 UTC m=+887.649796824" lastFinishedPulling="2025-10-12 07:49:54.312920024 +0000 UTC m=+891.102115526" observedRunningTime="2025-10-12 07:49:54.934636703 +0000 UTC m=+891.723832205" 
watchObservedRunningTime="2025-10-12 07:49:54.937950166 +0000 UTC m=+891.727145668" Oct 12 07:49:55 crc kubenswrapper[4599]: I1012 07:49:55.287463 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:55 crc kubenswrapper[4599]: I1012 07:49:55.331284 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55dcb544bd-wvf5d" Oct 12 07:49:55 crc kubenswrapper[4599]: I1012 07:49:55.667727 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 12 07:49:55 crc kubenswrapper[4599]: I1012 07:49:55.706972 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 12 07:49:55 crc kubenswrapper[4599]: I1012 07:49:55.813371 4599 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-bdcbd85c8-pvjzc" podUID="d5b5eea9-05b1-417c-a181-bb58db532acc" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": read tcp 10.217.0.2:36026->10.217.0.152:9311: read: connection reset by peer" Oct 12 07:49:55 crc kubenswrapper[4599]: I1012 07:49:55.813992 4599 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-bdcbd85c8-pvjzc" podUID="d5b5eea9-05b1-417c-a181-bb58db532acc" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": read tcp 10.217.0.2:36034->10.217.0.152:9311: read: connection reset by peer" Oct 12 07:49:55 crc kubenswrapper[4599]: I1012 07:49:55.867719 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5d9d5b7486-4r48s" Oct 12 07:49:55 crc kubenswrapper[4599]: I1012 07:49:55.930773 4599 generic.go:334] "Generic (PLEG): container finished" podID="d5b5eea9-05b1-417c-a181-bb58db532acc" containerID="31fbe1e390315b7164049d2e778c5dc4012cec9e0dd2e8e2f9c81250418829b7" exitCode=0 
Oct 12 07:49:55 crc kubenswrapper[4599]: I1012 07:49:55.931090 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b6dbf57f-d6e6-417f-9c84-968e6336e3a6" containerName="cinder-scheduler" containerID="cri-o://dc8ce30ad7abfcaccc058bec4b88434be1c58dd95b2012cfd0da33292d6d2a8f" gracePeriod=30 Oct 12 07:49:55 crc kubenswrapper[4599]: I1012 07:49:55.931234 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bdcbd85c8-pvjzc" event={"ID":"d5b5eea9-05b1-417c-a181-bb58db532acc","Type":"ContainerDied","Data":"31fbe1e390315b7164049d2e778c5dc4012cec9e0dd2e8e2f9c81250418829b7"} Oct 12 07:49:55 crc kubenswrapper[4599]: I1012 07:49:55.932757 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b6dbf57f-d6e6-417f-9c84-968e6336e3a6" containerName="probe" containerID="cri-o://0f6bb05a52897aac97d0fa11262581ea437fe7ef13f2f0f39c5270a9ffcfa14a" gracePeriod=30 Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.211369 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.287614 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp752\" (UniqueName: \"kubernetes.io/projected/d5b5eea9-05b1-417c-a181-bb58db532acc-kube-api-access-tp752\") pod \"d5b5eea9-05b1-417c-a181-bb58db532acc\" (UID: \"d5b5eea9-05b1-417c-a181-bb58db532acc\") " Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.287772 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b5eea9-05b1-417c-a181-bb58db532acc-combined-ca-bundle\") pod \"d5b5eea9-05b1-417c-a181-bb58db532acc\" (UID: \"d5b5eea9-05b1-417c-a181-bb58db532acc\") " Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.287803 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5b5eea9-05b1-417c-a181-bb58db532acc-logs\") pod \"d5b5eea9-05b1-417c-a181-bb58db532acc\" (UID: \"d5b5eea9-05b1-417c-a181-bb58db532acc\") " Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.287899 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5b5eea9-05b1-417c-a181-bb58db532acc-config-data-custom\") pod \"d5b5eea9-05b1-417c-a181-bb58db532acc\" (UID: \"d5b5eea9-05b1-417c-a181-bb58db532acc\") " Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.288008 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b5eea9-05b1-417c-a181-bb58db532acc-config-data\") pod \"d5b5eea9-05b1-417c-a181-bb58db532acc\" (UID: \"d5b5eea9-05b1-417c-a181-bb58db532acc\") " Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.288422 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d5b5eea9-05b1-417c-a181-bb58db532acc-logs" (OuterVolumeSpecName: "logs") pod "d5b5eea9-05b1-417c-a181-bb58db532acc" (UID: "d5b5eea9-05b1-417c-a181-bb58db532acc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.294656 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5b5eea9-05b1-417c-a181-bb58db532acc-kube-api-access-tp752" (OuterVolumeSpecName: "kube-api-access-tp752") pod "d5b5eea9-05b1-417c-a181-bb58db532acc" (UID: "d5b5eea9-05b1-417c-a181-bb58db532acc"). InnerVolumeSpecName "kube-api-access-tp752". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.295645 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5b5eea9-05b1-417c-a181-bb58db532acc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d5b5eea9-05b1-417c-a181-bb58db532acc" (UID: "d5b5eea9-05b1-417c-a181-bb58db532acc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.311918 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5b5eea9-05b1-417c-a181-bb58db532acc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5b5eea9-05b1-417c-a181-bb58db532acc" (UID: "d5b5eea9-05b1-417c-a181-bb58db532acc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.330478 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5b5eea9-05b1-417c-a181-bb58db532acc-config-data" (OuterVolumeSpecName: "config-data") pod "d5b5eea9-05b1-417c-a181-bb58db532acc" (UID: "d5b5eea9-05b1-417c-a181-bb58db532acc"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.390726 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b5eea9-05b1-417c-a181-bb58db532acc-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.390760 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp752\" (UniqueName: \"kubernetes.io/projected/d5b5eea9-05b1-417c-a181-bb58db532acc-kube-api-access-tp752\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.390774 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b5eea9-05b1-417c-a181-bb58db532acc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.390782 4599 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5b5eea9-05b1-417c-a181-bb58db532acc-logs\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.390791 4599 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5b5eea9-05b1-417c-a181-bb58db532acc-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.939190 4599 generic.go:334] "Generic (PLEG): container finished" podID="b6dbf57f-d6e6-417f-9c84-968e6336e3a6" containerID="0f6bb05a52897aac97d0fa11262581ea437fe7ef13f2f0f39c5270a9ffcfa14a" exitCode=0 Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.939365 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6dbf57f-d6e6-417f-9c84-968e6336e3a6","Type":"ContainerDied","Data":"0f6bb05a52897aac97d0fa11262581ea437fe7ef13f2f0f39c5270a9ffcfa14a"} Oct 12 07:49:56 
crc kubenswrapper[4599]: I1012 07:49:56.941415 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bdcbd85c8-pvjzc" event={"ID":"d5b5eea9-05b1-417c-a181-bb58db532acc","Type":"ContainerDied","Data":"0ca9a93a1222b4cb8067f5662b287e52f7ea492cb96f97f2c7592c5f6cc75682"} Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.941463 4599 scope.go:117] "RemoveContainer" containerID="31fbe1e390315b7164049d2e778c5dc4012cec9e0dd2e8e2f9c81250418829b7" Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.941483 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-bdcbd85c8-pvjzc" Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.980074 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-bdcbd85c8-pvjzc"] Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.984996 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-bdcbd85c8-pvjzc"] Oct 12 07:49:56 crc kubenswrapper[4599]: I1012 07:49:56.989837 4599 scope.go:117] "RemoveContainer" containerID="0aeb413e3dd8be9b9cf5e4df72e9eb5dd9985af11d03e78f933583c58116d3ee" Oct 12 07:49:57 crc kubenswrapper[4599]: I1012 07:49:57.441781 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 12 07:49:57 crc kubenswrapper[4599]: I1012 07:49:57.557771 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5b5eea9-05b1-417c-a181-bb58db532acc" path="/var/lib/kubelet/pods/d5b5eea9-05b1-417c-a181-bb58db532acc/volumes" Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.075436 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.115923 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-977d96ff-6cnmb"] Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.116365 4599 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-977d96ff-6cnmb" podUID="76d0d44d-634f-43a2-89b0-59874fdf35b5" containerName="dnsmasq-dns" containerID="cri-o://47b68a2bf58dedea2ed87f85daf28cc535cac673a563c130380a309b685a6627" gracePeriod=10 Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.321929 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.322280 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.554520 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-977d96ff-6cnmb" Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.644600 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-dns-svc\") pod \"76d0d44d-634f-43a2-89b0-59874fdf35b5\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.644656 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-ovsdbserver-nb\") pod \"76d0d44d-634f-43a2-89b0-59874fdf35b5\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.644687 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-config\") pod \"76d0d44d-634f-43a2-89b0-59874fdf35b5\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.644753 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-dns-swift-storage-0\") pod \"76d0d44d-634f-43a2-89b0-59874fdf35b5\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.644778 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56mvd\" (UniqueName: \"kubernetes.io/projected/76d0d44d-634f-43a2-89b0-59874fdf35b5-kube-api-access-56mvd\") pod \"76d0d44d-634f-43a2-89b0-59874fdf35b5\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.644942 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-ovsdbserver-sb\") pod \"76d0d44d-634f-43a2-89b0-59874fdf35b5\" (UID: \"76d0d44d-634f-43a2-89b0-59874fdf35b5\") " Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.650612 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76d0d44d-634f-43a2-89b0-59874fdf35b5-kube-api-access-56mvd" (OuterVolumeSpecName: "kube-api-access-56mvd") pod "76d0d44d-634f-43a2-89b0-59874fdf35b5" (UID: "76d0d44d-634f-43a2-89b0-59874fdf35b5"). InnerVolumeSpecName "kube-api-access-56mvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.685493 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "76d0d44d-634f-43a2-89b0-59874fdf35b5" (UID: "76d0d44d-634f-43a2-89b0-59874fdf35b5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.686312 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "76d0d44d-634f-43a2-89b0-59874fdf35b5" (UID: "76d0d44d-634f-43a2-89b0-59874fdf35b5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.699163 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "76d0d44d-634f-43a2-89b0-59874fdf35b5" (UID: "76d0d44d-634f-43a2-89b0-59874fdf35b5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.700769 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "76d0d44d-634f-43a2-89b0-59874fdf35b5" (UID: "76d0d44d-634f-43a2-89b0-59874fdf35b5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.705640 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-config" (OuterVolumeSpecName: "config") pod "76d0d44d-634f-43a2-89b0-59874fdf35b5" (UID: "76d0d44d-634f-43a2-89b0-59874fdf35b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.747640 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.748615 4599 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.748683 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.748749 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:49:58 crc kubenswrapper[4599]: 
I1012 07:49:58.748806 4599 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76d0d44d-634f-43a2-89b0-59874fdf35b5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.748855 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56mvd\" (UniqueName: \"kubernetes.io/projected/76d0d44d-634f-43a2-89b0-59874fdf35b5-kube-api-access-56mvd\") on node \"crc\" DevicePath \"\""
Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.958790 4599 generic.go:334] "Generic (PLEG): container finished" podID="b6dbf57f-d6e6-417f-9c84-968e6336e3a6" containerID="dc8ce30ad7abfcaccc058bec4b88434be1c58dd95b2012cfd0da33292d6d2a8f" exitCode=0
Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.958849 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6dbf57f-d6e6-417f-9c84-968e6336e3a6","Type":"ContainerDied","Data":"dc8ce30ad7abfcaccc058bec4b88434be1c58dd95b2012cfd0da33292d6d2a8f"}
Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.960975 4599 generic.go:334] "Generic (PLEG): container finished" podID="76d0d44d-634f-43a2-89b0-59874fdf35b5" containerID="47b68a2bf58dedea2ed87f85daf28cc535cac673a563c130380a309b685a6627" exitCode=0
Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.961072 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-977d96ff-6cnmb" event={"ID":"76d0d44d-634f-43a2-89b0-59874fdf35b5","Type":"ContainerDied","Data":"47b68a2bf58dedea2ed87f85daf28cc535cac673a563c130380a309b685a6627"}
Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.961235 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-977d96ff-6cnmb" event={"ID":"76d0d44d-634f-43a2-89b0-59874fdf35b5","Type":"ContainerDied","Data":"6ac04af0c17716c449dcc8f620738a11533e7a1cbf2ab11fc097d6d01477fd06"}
Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.961346 4599 scope.go:117] "RemoveContainer" containerID="47b68a2bf58dedea2ed87f85daf28cc535cac673a563c130380a309b685a6627"
Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.961142 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-977d96ff-6cnmb"
Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.992707 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-977d96ff-6cnmb"]
Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.994882 4599 scope.go:117] "RemoveContainer" containerID="43519f28e0fdc3576efd7baa3c616384644e0f881b9d82799f685dcef1e6361c"
Oct 12 07:49:58 crc kubenswrapper[4599]: I1012 07:49:58.997256 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-977d96ff-6cnmb"]
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.016014 4599 scope.go:117] "RemoveContainer" containerID="47b68a2bf58dedea2ed87f85daf28cc535cac673a563c130380a309b685a6627"
Oct 12 07:49:59 crc kubenswrapper[4599]: E1012 07:49:59.018293 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47b68a2bf58dedea2ed87f85daf28cc535cac673a563c130380a309b685a6627\": container with ID starting with 47b68a2bf58dedea2ed87f85daf28cc535cac673a563c130380a309b685a6627 not found: ID does not exist" containerID="47b68a2bf58dedea2ed87f85daf28cc535cac673a563c130380a309b685a6627"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.018418 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47b68a2bf58dedea2ed87f85daf28cc535cac673a563c130380a309b685a6627"} err="failed to get container status \"47b68a2bf58dedea2ed87f85daf28cc535cac673a563c130380a309b685a6627\": rpc error: code = NotFound desc = could not find container \"47b68a2bf58dedea2ed87f85daf28cc535cac673a563c130380a309b685a6627\": container with ID starting with 47b68a2bf58dedea2ed87f85daf28cc535cac673a563c130380a309b685a6627 not found: ID does not exist"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.018501 4599 scope.go:117] "RemoveContainer" containerID="43519f28e0fdc3576efd7baa3c616384644e0f881b9d82799f685dcef1e6361c"
Oct 12 07:49:59 crc kubenswrapper[4599]: E1012 07:49:59.018927 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43519f28e0fdc3576efd7baa3c616384644e0f881b9d82799f685dcef1e6361c\": container with ID starting with 43519f28e0fdc3576efd7baa3c616384644e0f881b9d82799f685dcef1e6361c not found: ID does not exist" containerID="43519f28e0fdc3576efd7baa3c616384644e0f881b9d82799f685dcef1e6361c"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.018963 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43519f28e0fdc3576efd7baa3c616384644e0f881b9d82799f685dcef1e6361c"} err="failed to get container status \"43519f28e0fdc3576efd7baa3c616384644e0f881b9d82799f685dcef1e6361c\": rpc error: code = NotFound desc = could not find container \"43519f28e0fdc3576efd7baa3c616384644e0f881b9d82799f685dcef1e6361c\": container with ID starting with 43519f28e0fdc3576efd7baa3c616384644e0f881b9d82799f685dcef1e6361c not found: ID does not exist"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.218384 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.282413 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Oct 12 07:49:59 crc kubenswrapper[4599]: E1012 07:49:59.282773 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5b5eea9-05b1-417c-a181-bb58db532acc" containerName="barbican-api"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.282789 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5b5eea9-05b1-417c-a181-bb58db532acc" containerName="barbican-api"
Oct 12 07:49:59 crc kubenswrapper[4599]: E1012 07:49:59.282797 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6dbf57f-d6e6-417f-9c84-968e6336e3a6" containerName="probe"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.282803 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6dbf57f-d6e6-417f-9c84-968e6336e3a6" containerName="probe"
Oct 12 07:49:59 crc kubenswrapper[4599]: E1012 07:49:59.282813 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6dbf57f-d6e6-417f-9c84-968e6336e3a6" containerName="cinder-scheduler"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.282819 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6dbf57f-d6e6-417f-9c84-968e6336e3a6" containerName="cinder-scheduler"
Oct 12 07:49:59 crc kubenswrapper[4599]: E1012 07:49:59.282825 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d0d44d-634f-43a2-89b0-59874fdf35b5" containerName="dnsmasq-dns"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.282832 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d0d44d-634f-43a2-89b0-59874fdf35b5" containerName="dnsmasq-dns"
Oct 12 07:49:59 crc kubenswrapper[4599]: E1012 07:49:59.282845 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5b5eea9-05b1-417c-a181-bb58db532acc" containerName="barbican-api-log"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.282851 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5b5eea9-05b1-417c-a181-bb58db532acc" containerName="barbican-api-log"
Oct 12 07:49:59 crc kubenswrapper[4599]: E1012 07:49:59.282873 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d0d44d-634f-43a2-89b0-59874fdf35b5" containerName="init"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.282878 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d0d44d-634f-43a2-89b0-59874fdf35b5" containerName="init"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.283041 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="76d0d44d-634f-43a2-89b0-59874fdf35b5" containerName="dnsmasq-dns"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.283053 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5b5eea9-05b1-417c-a181-bb58db532acc" containerName="barbican-api"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.283065 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6dbf57f-d6e6-417f-9c84-968e6336e3a6" containerName="probe"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.283078 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6dbf57f-d6e6-417f-9c84-968e6336e3a6" containerName="cinder-scheduler"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.283086 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5b5eea9-05b1-417c-a181-bb58db532acc" containerName="barbican-api-log"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.283669 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.286280 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.286675 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-qwr7w"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.286975 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.293287 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.361248 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-config-data\") pod \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") "
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.361580 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78s68\" (UniqueName: \"kubernetes.io/projected/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-kube-api-access-78s68\") pod \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") "
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.361659 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-etc-machine-id\") pod \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") "
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.361702 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-combined-ca-bundle\") pod \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") "
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.361750 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-config-data-custom\") pod \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") "
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.361804 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-scripts\") pod \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\" (UID: \"b6dbf57f-d6e6-417f-9c84-968e6336e3a6\") "
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.361795 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b6dbf57f-d6e6-417f-9c84-968e6336e3a6" (UID: "b6dbf57f-d6e6-417f-9c84-968e6336e3a6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.362367 4599 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.368738 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-scripts" (OuterVolumeSpecName: "scripts") pod "b6dbf57f-d6e6-417f-9c84-968e6336e3a6" (UID: "b6dbf57f-d6e6-417f-9c84-968e6336e3a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.376264 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-kube-api-access-78s68" (OuterVolumeSpecName: "kube-api-access-78s68") pod "b6dbf57f-d6e6-417f-9c84-968e6336e3a6" (UID: "b6dbf57f-d6e6-417f-9c84-968e6336e3a6"). InnerVolumeSpecName "kube-api-access-78s68". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.383139 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b6dbf57f-d6e6-417f-9c84-968e6336e3a6" (UID: "b6dbf57f-d6e6-417f-9c84-968e6336e3a6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.407534 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6dbf57f-d6e6-417f-9c84-968e6336e3a6" (UID: "b6dbf57f-d6e6-417f-9c84-968e6336e3a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.446859 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-config-data" (OuterVolumeSpecName: "config-data") pod "b6dbf57f-d6e6-417f-9c84-968e6336e3a6" (UID: "b6dbf57f-d6e6-417f-9c84-968e6336e3a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.464555 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp5rt\" (UniqueName: \"kubernetes.io/projected/c2b81c78-c166-45cd-866b-e58db68f708e-kube-api-access-wp5rt\") pod \"openstackclient\" (UID: \"c2b81c78-c166-45cd-866b-e58db68f708e\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.464791 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c2b81c78-c166-45cd-866b-e58db68f708e-openstack-config-secret\") pod \"openstackclient\" (UID: \"c2b81c78-c166-45cd-866b-e58db68f708e\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.465195 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2b81c78-c166-45cd-866b-e58db68f708e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c2b81c78-c166-45cd-866b-e58db68f708e\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.465291 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c2b81c78-c166-45cd-866b-e58db68f708e-openstack-config\") pod \"openstackclient\" (UID: \"c2b81c78-c166-45cd-866b-e58db68f708e\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.465655 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-config-data\") on node \"crc\" DevicePath \"\""
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.465677 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78s68\" (UniqueName: \"kubernetes.io/projected/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-kube-api-access-78s68\") on node \"crc\" DevicePath \"\""
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.465691 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.465701 4599 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.465717 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6dbf57f-d6e6-417f-9c84-968e6336e3a6-scripts\") on node \"crc\" DevicePath \"\""
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.504481 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Oct 12 07:49:59 crc kubenswrapper[4599]: E1012 07:49:59.505904 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-wp5rt openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="c2b81c78-c166-45cd-866b-e58db68f708e"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.513366 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.556938 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76d0d44d-634f-43a2-89b0-59874fdf35b5" path="/var/lib/kubelet/pods/76d0d44d-634f-43a2-89b0-59874fdf35b5/volumes"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.557729 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2b81c78-c166-45cd-866b-e58db68f708e" path="/var/lib/kubelet/pods/c2b81c78-c166-45cd-866b-e58db68f708e/volumes"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.567413 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp5rt\" (UniqueName: \"kubernetes.io/projected/c2b81c78-c166-45cd-866b-e58db68f708e-kube-api-access-wp5rt\") pod \"openstackclient\" (UID: \"c2b81c78-c166-45cd-866b-e58db68f708e\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.567462 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c2b81c78-c166-45cd-866b-e58db68f708e-openstack-config-secret\") pod \"openstackclient\" (UID: \"c2b81c78-c166-45cd-866b-e58db68f708e\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.567529 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2b81c78-c166-45cd-866b-e58db68f708e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c2b81c78-c166-45cd-866b-e58db68f708e\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.567556 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c2b81c78-c166-45cd-866b-e58db68f708e-openstack-config\") pod \"openstackclient\" (UID: \"c2b81c78-c166-45cd-866b-e58db68f708e\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.569148 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c2b81c78-c166-45cd-866b-e58db68f708e-openstack-config\") pod \"openstackclient\" (UID: \"c2b81c78-c166-45cd-866b-e58db68f708e\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: E1012 07:49:59.570872 4599 projected.go:194] Error preparing data for projected volume kube-api-access-wp5rt for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Oct 12 07:49:59 crc kubenswrapper[4599]: E1012 07:49:59.570973 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2b81c78-c166-45cd-866b-e58db68f708e-kube-api-access-wp5rt podName:c2b81c78-c166-45cd-866b-e58db68f708e nodeName:}" failed. No retries permitted until 2025-10-12 07:50:00.070943304 +0000 UTC m=+896.860138807 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wp5rt" (UniqueName: "kubernetes.io/projected/c2b81c78-c166-45cd-866b-e58db68f708e-kube-api-access-wp5rt") pod "openstackclient" (UID: "c2b81c78-c166-45cd-866b-e58db68f708e") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.572384 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2b81c78-c166-45cd-866b-e58db68f708e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c2b81c78-c166-45cd-866b-e58db68f708e\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.573795 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c2b81c78-c166-45cd-866b-e58db68f708e-openstack-config-secret\") pod \"openstackclient\" (UID: \"c2b81c78-c166-45cd-866b-e58db68f708e\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.597885 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.599148 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.604797 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.774821 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b6db97ed-496b-4f4d-bb27-2bce6e003912-openstack-config-secret\") pod \"openstackclient\" (UID: \"b6db97ed-496b-4f4d-bb27-2bce6e003912\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.774912 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlwgt\" (UniqueName: \"kubernetes.io/projected/b6db97ed-496b-4f4d-bb27-2bce6e003912-kube-api-access-nlwgt\") pod \"openstackclient\" (UID: \"b6db97ed-496b-4f4d-bb27-2bce6e003912\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.774932 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b6db97ed-496b-4f4d-bb27-2bce6e003912-openstack-config\") pod \"openstackclient\" (UID: \"b6db97ed-496b-4f4d-bb27-2bce6e003912\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.775000 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6db97ed-496b-4f4d-bb27-2bce6e003912-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b6db97ed-496b-4f4d-bb27-2bce6e003912\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.877769 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b6db97ed-496b-4f4d-bb27-2bce6e003912-openstack-config-secret\") pod \"openstackclient\" (UID: \"b6db97ed-496b-4f4d-bb27-2bce6e003912\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.878090 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlwgt\" (UniqueName: \"kubernetes.io/projected/b6db97ed-496b-4f4d-bb27-2bce6e003912-kube-api-access-nlwgt\") pod \"openstackclient\" (UID: \"b6db97ed-496b-4f4d-bb27-2bce6e003912\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.878628 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b6db97ed-496b-4f4d-bb27-2bce6e003912-openstack-config\") pod \"openstackclient\" (UID: \"b6db97ed-496b-4f4d-bb27-2bce6e003912\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.878793 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6db97ed-496b-4f4d-bb27-2bce6e003912-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b6db97ed-496b-4f4d-bb27-2bce6e003912\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.879728 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b6db97ed-496b-4f4d-bb27-2bce6e003912-openstack-config\") pod \"openstackclient\" (UID: \"b6db97ed-496b-4f4d-bb27-2bce6e003912\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.882714 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b6db97ed-496b-4f4d-bb27-2bce6e003912-openstack-config-secret\") pod \"openstackclient\" (UID: \"b6db97ed-496b-4f4d-bb27-2bce6e003912\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.882721 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6db97ed-496b-4f4d-bb27-2bce6e003912-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b6db97ed-496b-4f4d-bb27-2bce6e003912\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.893258 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlwgt\" (UniqueName: \"kubernetes.io/projected/b6db97ed-496b-4f4d-bb27-2bce6e003912-kube-api-access-nlwgt\") pod \"openstackclient\" (UID: \"b6db97ed-496b-4f4d-bb27-2bce6e003912\") " pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.912801 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.970181 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6dbf57f-d6e6-417f-9c84-968e6336e3a6","Type":"ContainerDied","Data":"42d3d8dba34475067a6f8ff5a8c1acad6aa7ea396d755054f8b4746bd177532a"}
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.970510 4599 scope.go:117] "RemoveContainer" containerID="0f6bb05a52897aac97d0fa11262581ea437fe7ef13f2f0f39c5270a9ffcfa14a"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.970208 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.971608 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.982367 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 12 07:49:59 crc kubenswrapper[4599]: I1012 07:49:59.994395 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.019253 4599 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c2b81c78-c166-45cd-866b-e58db68f708e" podUID="b6db97ed-496b-4f4d-bb27-2bce6e003912"
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.029395 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.079486 4599 scope.go:117] "RemoveContainer" containerID="dc8ce30ad7abfcaccc058bec4b88434be1c58dd95b2012cfd0da33292d6d2a8f"
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.082206 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c2b81c78-c166-45cd-866b-e58db68f708e-openstack-config\") pod \"c2b81c78-c166-45cd-866b-e58db68f708e\" (UID: \"c2b81c78-c166-45cd-866b-e58db68f708e\") "
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.082244 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2b81c78-c166-45cd-866b-e58db68f708e-combined-ca-bundle\") pod \"c2b81c78-c166-45cd-866b-e58db68f708e\" (UID: \"c2b81c78-c166-45cd-866b-e58db68f708e\") "
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.082351 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c2b81c78-c166-45cd-866b-e58db68f708e-openstack-config-secret\") pod \"c2b81c78-c166-45cd-866b-e58db68f708e\" (UID: \"c2b81c78-c166-45cd-866b-e58db68f708e\") "
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.082900 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp5rt\" (UniqueName: \"kubernetes.io/projected/c2b81c78-c166-45cd-866b-e58db68f708e-kube-api-access-wp5rt\") on node \"crc\" DevicePath \"\""
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.101380 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2b81c78-c166-45cd-866b-e58db68f708e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2b81c78-c166-45cd-866b-e58db68f708e" (UID: "c2b81c78-c166-45cd-866b-e58db68f708e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.103180 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2b81c78-c166-45cd-866b-e58db68f708e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c2b81c78-c166-45cd-866b-e58db68f708e" (UID: "c2b81c78-c166-45cd-866b-e58db68f708e"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.113465 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2b81c78-c166-45cd-866b-e58db68f708e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c2b81c78-c166-45cd-866b-e58db68f708e" (UID: "c2b81c78-c166-45cd-866b-e58db68f708e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.125407 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.127090 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.133093 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.136354 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.184712 4599 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c2b81c78-c166-45cd-866b-e58db68f708e-openstack-config\") on node \"crc\" DevicePath \"\""
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.184768 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2b81c78-c166-45cd-866b-e58db68f708e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.184779 4599 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c2b81c78-c166-45cd-866b-e58db68f708e-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.244399 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 12 07:50:00 crc kubenswrapper[4599]: W1012 07:50:00.255420 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6db97ed_496b_4f4d_bb27_2bce6e003912.slice/crio-14102e7e1899cdd360ff30290d3511054605c7e1e8c10c4a76d044d6b65bc5bc WatchSource:0}: Error finding container 14102e7e1899cdd360ff30290d3511054605c7e1e8c10c4a76d044d6b65bc5bc: Status 404 returned error can't find the container with id 14102e7e1899cdd360ff30290d3511054605c7e1e8c10c4a76d044d6b65bc5bc
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.285981 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfa968eb-bc29-434a-a2f6-6aebf5c8beda-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dfa968eb-bc29-434a-a2f6-6aebf5c8beda\") " pod="openstack/cinder-scheduler-0"
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.286086 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd6jc\" (UniqueName: \"kubernetes.io/projected/dfa968eb-bc29-434a-a2f6-6aebf5c8beda-kube-api-access-hd6jc\") pod \"cinder-scheduler-0\" (UID: \"dfa968eb-bc29-434a-a2f6-6aebf5c8beda\") " pod="openstack/cinder-scheduler-0"
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.286150 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfa968eb-bc29-434a-a2f6-6aebf5c8beda-config-data\") pod \"cinder-scheduler-0\" (UID: \"dfa968eb-bc29-434a-a2f6-6aebf5c8beda\") " pod="openstack/cinder-scheduler-0"
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.286266 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfa968eb-bc29-434a-a2f6-6aebf5c8beda-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dfa968eb-bc29-434a-a2f6-6aebf5c8beda\") " pod="openstack/cinder-scheduler-0"
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.286288 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfa968eb-bc29-434a-a2f6-6aebf5c8beda-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dfa968eb-bc29-434a-a2f6-6aebf5c8beda\") " pod="openstack/cinder-scheduler-0"
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.286345 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfa968eb-bc29-434a-a2f6-6aebf5c8beda-scripts\") pod \"cinder-scheduler-0\" (UID: \"dfa968eb-bc29-434a-a2f6-6aebf5c8beda\") " pod="openstack/cinder-scheduler-0"
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.387035 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfa968eb-bc29-434a-a2f6-6aebf5c8beda-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dfa968eb-bc29-434a-a2f6-6aebf5c8beda\") " pod="openstack/cinder-scheduler-0"
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.387682 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfa968eb-bc29-434a-a2f6-6aebf5c8beda-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dfa968eb-bc29-434a-a2f6-6aebf5c8beda\") " pod="openstack/cinder-scheduler-0"
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.387869 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfa968eb-bc29-434a-a2f6-6aebf5c8beda-scripts\") pod \"cinder-scheduler-0\" (UID: \"dfa968eb-bc29-434a-a2f6-6aebf5c8beda\") " pod="openstack/cinder-scheduler-0"
Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.388496 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfa968eb-bc29-434a-a2f6-6aebf5c8beda-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dfa968eb-bc29-434a-a2f6-6aebf5c8beda\") "
pod="openstack/cinder-scheduler-0" Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.388654 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd6jc\" (UniqueName: \"kubernetes.io/projected/dfa968eb-bc29-434a-a2f6-6aebf5c8beda-kube-api-access-hd6jc\") pod \"cinder-scheduler-0\" (UID: \"dfa968eb-bc29-434a-a2f6-6aebf5c8beda\") " pod="openstack/cinder-scheduler-0" Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.388748 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfa968eb-bc29-434a-a2f6-6aebf5c8beda-config-data\") pod \"cinder-scheduler-0\" (UID: \"dfa968eb-bc29-434a-a2f6-6aebf5c8beda\") " pod="openstack/cinder-scheduler-0" Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.388865 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfa968eb-bc29-434a-a2f6-6aebf5c8beda-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dfa968eb-bc29-434a-a2f6-6aebf5c8beda\") " pod="openstack/cinder-scheduler-0" Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.392070 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfa968eb-bc29-434a-a2f6-6aebf5c8beda-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dfa968eb-bc29-434a-a2f6-6aebf5c8beda\") " pod="openstack/cinder-scheduler-0" Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.392143 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfa968eb-bc29-434a-a2f6-6aebf5c8beda-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dfa968eb-bc29-434a-a2f6-6aebf5c8beda\") " pod="openstack/cinder-scheduler-0" Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.392440 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfa968eb-bc29-434a-a2f6-6aebf5c8beda-scripts\") pod \"cinder-scheduler-0\" (UID: \"dfa968eb-bc29-434a-a2f6-6aebf5c8beda\") " pod="openstack/cinder-scheduler-0" Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.393794 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfa968eb-bc29-434a-a2f6-6aebf5c8beda-config-data\") pod \"cinder-scheduler-0\" (UID: \"dfa968eb-bc29-434a-a2f6-6aebf5c8beda\") " pod="openstack/cinder-scheduler-0" Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.403979 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd6jc\" (UniqueName: \"kubernetes.io/projected/dfa968eb-bc29-434a-a2f6-6aebf5c8beda-kube-api-access-hd6jc\") pod \"cinder-scheduler-0\" (UID: \"dfa968eb-bc29-434a-a2f6-6aebf5c8beda\") " pod="openstack/cinder-scheduler-0" Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.452177 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.855873 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.984165 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b6db97ed-496b-4f4d-bb27-2bce6e003912","Type":"ContainerStarted","Data":"14102e7e1899cdd360ff30290d3511054605c7e1e8c10c4a76d044d6b65bc5bc"} Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.986384 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 12 07:50:00 crc kubenswrapper[4599]: I1012 07:50:00.992610 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dfa968eb-bc29-434a-a2f6-6aebf5c8beda","Type":"ContainerStarted","Data":"1aba69d864d023b26c8829e1614d99eee06c8f7c3e810732273758ff8cd177e5"} Oct 12 07:50:01 crc kubenswrapper[4599]: I1012 07:50:01.003294 4599 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c2b81c78-c166-45cd-866b-e58db68f708e" podUID="b6db97ed-496b-4f4d-bb27-2bce6e003912" Oct 12 07:50:01 crc kubenswrapper[4599]: I1012 07:50:01.554697 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6dbf57f-d6e6-417f-9c84-968e6336e3a6" path="/var/lib/kubelet/pods/b6dbf57f-d6e6-417f-9c84-968e6336e3a6/volumes" Oct 12 07:50:01 crc kubenswrapper[4599]: I1012 07:50:01.555425 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2b81c78-c166-45cd-866b-e58db68f708e" path="/var/lib/kubelet/pods/c2b81c78-c166-45cd-866b-e58db68f708e/volumes" Oct 12 07:50:02 crc kubenswrapper[4599]: I1012 07:50:02.000758 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dfa968eb-bc29-434a-a2f6-6aebf5c8beda","Type":"ContainerStarted","Data":"89b55a1aee4db5048a0f73470537e205c947a6d288707f3411467d42dfd6d37c"} Oct 12 07:50:02 crc kubenswrapper[4599]: I1012 07:50:02.001031 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dfa968eb-bc29-434a-a2f6-6aebf5c8beda","Type":"ContainerStarted","Data":"26f597cabdffda3dc8549aa4ea7c86865ff2f324c0210f35478574e3e565a6d6"} Oct 12 07:50:02 crc kubenswrapper[4599]: I1012 07:50:02.020985 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.020952169 podStartE2EDuration="2.020952169s" 
podCreationTimestamp="2025-10-12 07:50:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:50:02.014516262 +0000 UTC m=+898.803711763" watchObservedRunningTime="2025-10-12 07:50:02.020952169 +0000 UTC m=+898.810147670" Oct 12 07:50:03 crc kubenswrapper[4599]: I1012 07:50:03.777168 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 07:50:03 crc kubenswrapper[4599]: I1012 07:50:03.777675 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="dc31b00c-21aa-4427-9b7a-c505c0520a2b" containerName="glance-log" containerID="cri-o://66e2798ff6540355b4f8a0476df8a51f5593324c7dc6dcff58b90c70631d878f" gracePeriod=30 Oct 12 07:50:03 crc kubenswrapper[4599]: I1012 07:50:03.777782 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="dc31b00c-21aa-4427-9b7a-c505c0520a2b" containerName="glance-httpd" containerID="cri-o://f912827e300eaa1554f6da57f1cfb2f16457c99ae40ae89ae4358b683feb7234" gracePeriod=30 Oct 12 07:50:03 crc kubenswrapper[4599]: I1012 07:50:03.812568 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:50:03 crc kubenswrapper[4599]: I1012 07:50:03.813175 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="412835ea-6470-43ff-896b-98da92bb7393" containerName="sg-core" containerID="cri-o://6dc982e6d6678280a87bbc087bc07e454ed6ac050468fb389f84f383b998624b" gracePeriod=30 Oct 12 07:50:03 crc kubenswrapper[4599]: I1012 07:50:03.813119 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="412835ea-6470-43ff-896b-98da92bb7393" containerName="ceilometer-central-agent" 
containerID="cri-o://9f1e3309f87e6bc3c1a86687b03d8d42468c0a05e56b65a1d42faa3f80cb0de9" gracePeriod=30 Oct 12 07:50:03 crc kubenswrapper[4599]: I1012 07:50:03.813162 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="412835ea-6470-43ff-896b-98da92bb7393" containerName="proxy-httpd" containerID="cri-o://7c4482729785f9de8488a78c1c54e0e328e2b6a2cd8a10effc46ef27c500410d" gracePeriod=30 Oct 12 07:50:03 crc kubenswrapper[4599]: I1012 07:50:03.813189 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="412835ea-6470-43ff-896b-98da92bb7393" containerName="ceilometer-notification-agent" containerID="cri-o://0b52b9ddb35fd1b5fde0183f396cf3cc0ff2109d3a809f577cd974374526c102" gracePeriod=30 Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.024945 4599 generic.go:334] "Generic (PLEG): container finished" podID="dc31b00c-21aa-4427-9b7a-c505c0520a2b" containerID="66e2798ff6540355b4f8a0476df8a51f5593324c7dc6dcff58b90c70631d878f" exitCode=143 Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.025002 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dc31b00c-21aa-4427-9b7a-c505c0520a2b","Type":"ContainerDied","Data":"66e2798ff6540355b4f8a0476df8a51f5593324c7dc6dcff58b90c70631d878f"} Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.027400 4599 generic.go:334] "Generic (PLEG): container finished" podID="412835ea-6470-43ff-896b-98da92bb7393" containerID="7c4482729785f9de8488a78c1c54e0e328e2b6a2cd8a10effc46ef27c500410d" exitCode=0 Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.027420 4599 generic.go:334] "Generic (PLEG): container finished" podID="412835ea-6470-43ff-896b-98da92bb7393" containerID="6dc982e6d6678280a87bbc087bc07e454ed6ac050468fb389f84f383b998624b" exitCode=2 Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.027434 4599 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"412835ea-6470-43ff-896b-98da92bb7393","Type":"ContainerDied","Data":"7c4482729785f9de8488a78c1c54e0e328e2b6a2cd8a10effc46ef27c500410d"} Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.027449 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"412835ea-6470-43ff-896b-98da92bb7393","Type":"ContainerDied","Data":"6dc982e6d6678280a87bbc087bc07e454ed6ac050468fb389f84f383b998624b"} Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.615772 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-786bc7649f-6qt66"] Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.617434 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.623948 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.624211 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.624877 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.630868 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-786bc7649f-6qt66"] Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.713247 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a10375e-8317-473b-87b4-07c82831ac41-public-tls-certs\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.713460 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a10375e-8317-473b-87b4-07c82831ac41-internal-tls-certs\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.713492 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a10375e-8317-473b-87b4-07c82831ac41-combined-ca-bundle\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.713522 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a10375e-8317-473b-87b4-07c82831ac41-run-httpd\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.713554 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xthh8\" (UniqueName: \"kubernetes.io/projected/9a10375e-8317-473b-87b4-07c82831ac41-kube-api-access-xthh8\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.713582 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9a10375e-8317-473b-87b4-07c82831ac41-etc-swift\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 
07:50:04.713628 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a10375e-8317-473b-87b4-07c82831ac41-config-data\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.713644 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a10375e-8317-473b-87b4-07c82831ac41-log-httpd\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.816948 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a10375e-8317-473b-87b4-07c82831ac41-internal-tls-certs\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.816998 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a10375e-8317-473b-87b4-07c82831ac41-combined-ca-bundle\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.817043 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a10375e-8317-473b-87b4-07c82831ac41-run-httpd\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.817079 4599 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xthh8\" (UniqueName: \"kubernetes.io/projected/9a10375e-8317-473b-87b4-07c82831ac41-kube-api-access-xthh8\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.817116 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9a10375e-8317-473b-87b4-07c82831ac41-etc-swift\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.817192 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a10375e-8317-473b-87b4-07c82831ac41-config-data\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.817214 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a10375e-8317-473b-87b4-07c82831ac41-log-httpd\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.817238 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a10375e-8317-473b-87b4-07c82831ac41-public-tls-certs\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.818799 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a10375e-8317-473b-87b4-07c82831ac41-run-httpd\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.819080 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a10375e-8317-473b-87b4-07c82831ac41-log-httpd\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.825261 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9a10375e-8317-473b-87b4-07c82831ac41-etc-swift\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.825634 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a10375e-8317-473b-87b4-07c82831ac41-combined-ca-bundle\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.825702 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a10375e-8317-473b-87b4-07c82831ac41-public-tls-certs\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.826721 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a10375e-8317-473b-87b4-07c82831ac41-internal-tls-certs\") 
pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.827541 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a10375e-8317-473b-87b4-07c82831ac41-config-data\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.837197 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xthh8\" (UniqueName: \"kubernetes.io/projected/9a10375e-8317-473b-87b4-07c82831ac41-kube-api-access-xthh8\") pod \"swift-proxy-786bc7649f-6qt66\" (UID: \"9a10375e-8317-473b-87b4-07c82831ac41\") " pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:04 crc kubenswrapper[4599]: I1012 07:50:04.931075 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:05 crc kubenswrapper[4599]: I1012 07:50:05.052303 4599 generic.go:334] "Generic (PLEG): container finished" podID="412835ea-6470-43ff-896b-98da92bb7393" containerID="9f1e3309f87e6bc3c1a86687b03d8d42468c0a05e56b65a1d42faa3f80cb0de9" exitCode=0 Oct 12 07:50:05 crc kubenswrapper[4599]: I1012 07:50:05.052377 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"412835ea-6470-43ff-896b-98da92bb7393","Type":"ContainerDied","Data":"9f1e3309f87e6bc3c1a86687b03d8d42468c0a05e56b65a1d42faa3f80cb0de9"} Oct 12 07:50:05 crc kubenswrapper[4599]: I1012 07:50:05.419470 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-786bc7649f-6qt66"] Oct 12 07:50:05 crc kubenswrapper[4599]: I1012 07:50:05.452714 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 12 07:50:06 crc kubenswrapper[4599]: I1012 07:50:06.067211 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-786bc7649f-6qt66" event={"ID":"9a10375e-8317-473b-87b4-07c82831ac41","Type":"ContainerStarted","Data":"6c2b4306cf590bb2ad3e96ac89e8c9304dba22dec84c61150d17003458396c17"} Oct 12 07:50:06 crc kubenswrapper[4599]: I1012 07:50:06.067539 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-786bc7649f-6qt66" event={"ID":"9a10375e-8317-473b-87b4-07c82831ac41","Type":"ContainerStarted","Data":"c2419bf8e7d1fefcbf3e99d074b0e86e11c220683a4a7ed53b0ed1b0c2bc2817"} Oct 12 07:50:06 crc kubenswrapper[4599]: I1012 07:50:06.067552 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-786bc7649f-6qt66" event={"ID":"9a10375e-8317-473b-87b4-07c82831ac41","Type":"ContainerStarted","Data":"33cc195c96406699bbe1fd8c433fde5490e1e23d054b6bf5585370d38f01b580"} Oct 12 07:50:06 crc kubenswrapper[4599]: I1012 07:50:06.067570 4599 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:06 crc kubenswrapper[4599]: I1012 07:50:06.090142 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-786bc7649f-6qt66" podStartSLOduration=2.090111547 podStartE2EDuration="2.090111547s" podCreationTimestamp="2025-10-12 07:50:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:50:06.084087458 +0000 UTC m=+902.873282960" watchObservedRunningTime="2025-10-12 07:50:06.090111547 +0000 UTC m=+902.879307049" Oct 12 07:50:07 crc kubenswrapper[4599]: I1012 07:50:07.082826 4599 generic.go:334] "Generic (PLEG): container finished" podID="dc31b00c-21aa-4427-9b7a-c505c0520a2b" containerID="f912827e300eaa1554f6da57f1cfb2f16457c99ae40ae89ae4358b683feb7234" exitCode=0 Oct 12 07:50:07 crc kubenswrapper[4599]: I1012 07:50:07.085526 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dc31b00c-21aa-4427-9b7a-c505c0520a2b","Type":"ContainerDied","Data":"f912827e300eaa1554f6da57f1cfb2f16457c99ae40ae89ae4358b683feb7234"} Oct 12 07:50:07 crc kubenswrapper[4599]: I1012 07:50:07.085602 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:08 crc kubenswrapper[4599]: I1012 07:50:08.092345 4599 generic.go:334] "Generic (PLEG): container finished" podID="412835ea-6470-43ff-896b-98da92bb7393" containerID="0b52b9ddb35fd1b5fde0183f396cf3cc0ff2109d3a809f577cd974374526c102" exitCode=0 Oct 12 07:50:08 crc kubenswrapper[4599]: I1012 07:50:08.092365 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"412835ea-6470-43ff-896b-98da92bb7393","Type":"ContainerDied","Data":"0b52b9ddb35fd1b5fde0183f396cf3cc0ff2109d3a809f577cd974374526c102"} Oct 12 07:50:10 crc kubenswrapper[4599]: I1012 
07:50:10.654579 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.549493 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.554643 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6k49\" (UniqueName: \"kubernetes.io/projected/dc31b00c-21aa-4427-9b7a-c505c0520a2b-kube-api-access-x6k49\") pod \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.570310 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-combined-ca-bundle\") pod \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.570384 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-internal-tls-certs\") pod \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.570422 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.570460 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-scripts\") pod 
\"dc31b00c-21aa-4427-9b7a-c505c0520a2b\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.570484 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-config-data\") pod \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.570531 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc31b00c-21aa-4427-9b7a-c505c0520a2b-logs\") pod \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.570577 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc31b00c-21aa-4427-9b7a-c505c0520a2b-httpd-run\") pod \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\" (UID: \"dc31b00c-21aa-4427-9b7a-c505c0520a2b\") " Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.571388 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc31b00c-21aa-4427-9b7a-c505c0520a2b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dc31b00c-21aa-4427-9b7a-c505c0520a2b" (UID: "dc31b00c-21aa-4427-9b7a-c505c0520a2b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.574814 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc31b00c-21aa-4427-9b7a-c505c0520a2b-kube-api-access-x6k49" (OuterVolumeSpecName: "kube-api-access-x6k49") pod "dc31b00c-21aa-4427-9b7a-c505c0520a2b" (UID: "dc31b00c-21aa-4427-9b7a-c505c0520a2b"). InnerVolumeSpecName "kube-api-access-x6k49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.577654 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc31b00c-21aa-4427-9b7a-c505c0520a2b-logs" (OuterVolumeSpecName: "logs") pod "dc31b00c-21aa-4427-9b7a-c505c0520a2b" (UID: "dc31b00c-21aa-4427-9b7a-c505c0520a2b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.585079 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-scripts" (OuterVolumeSpecName: "scripts") pod "dc31b00c-21aa-4427-9b7a-c505c0520a2b" (UID: "dc31b00c-21aa-4427-9b7a-c505c0520a2b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.607802 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.608060 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "dc31b00c-21aa-4427-9b7a-c505c0520a2b" (UID: "dc31b00c-21aa-4427-9b7a-c505c0520a2b"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.608113 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="88d1ee0b-4340-44c2-b357-1eb1741e30ca" containerName="glance-log" containerID="cri-o://69ac54b513822bd5d17bba9e6347a4aa4221ed49830122d6174fde84876ecdd4" gracePeriod=30 Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.608389 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="88d1ee0b-4340-44c2-b357-1eb1741e30ca" containerName="glance-httpd" containerID="cri-o://c1e366286f4e9ddc378f0672a98684e65e1219d248fbd57600ff6fceda569cec" gracePeriod=30 Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.656405 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc31b00c-21aa-4427-9b7a-c505c0520a2b" (UID: "dc31b00c-21aa-4427-9b7a-c505c0520a2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.659626 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-config-data" (OuterVolumeSpecName: "config-data") pod "dc31b00c-21aa-4427-9b7a-c505c0520a2b" (UID: "dc31b00c-21aa-4427-9b7a-c505c0520a2b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.661356 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dc31b00c-21aa-4427-9b7a-c505c0520a2b" (UID: "dc31b00c-21aa-4427-9b7a-c505c0520a2b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.672886 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6k49\" (UniqueName: \"kubernetes.io/projected/dc31b00c-21aa-4427-9b7a-c505c0520a2b-kube-api-access-x6k49\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.672911 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.672921 4599 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.672946 4599 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.672955 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.672964 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dc31b00c-21aa-4427-9b7a-c505c0520a2b-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.672971 4599 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc31b00c-21aa-4427-9b7a-c505c0520a2b-logs\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.672980 4599 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc31b00c-21aa-4427-9b7a-c505c0520a2b-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.688062 4599 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.712697 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.774791 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48bhv\" (UniqueName: \"kubernetes.io/projected/412835ea-6470-43ff-896b-98da92bb7393-kube-api-access-48bhv\") pod \"412835ea-6470-43ff-896b-98da92bb7393\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.774936 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/412835ea-6470-43ff-896b-98da92bb7393-run-httpd\") pod \"412835ea-6470-43ff-896b-98da92bb7393\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.775026 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-combined-ca-bundle\") pod 
\"412835ea-6470-43ff-896b-98da92bb7393\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.775072 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-config-data\") pod \"412835ea-6470-43ff-896b-98da92bb7393\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.775101 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-sg-core-conf-yaml\") pod \"412835ea-6470-43ff-896b-98da92bb7393\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.775124 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/412835ea-6470-43ff-896b-98da92bb7393-log-httpd\") pod \"412835ea-6470-43ff-896b-98da92bb7393\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.775145 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-scripts\") pod \"412835ea-6470-43ff-896b-98da92bb7393\" (UID: \"412835ea-6470-43ff-896b-98da92bb7393\") " Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.775434 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/412835ea-6470-43ff-896b-98da92bb7393-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "412835ea-6470-43ff-896b-98da92bb7393" (UID: "412835ea-6470-43ff-896b-98da92bb7393"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.776187 4599 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.776208 4599 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/412835ea-6470-43ff-896b-98da92bb7393-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.776648 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/412835ea-6470-43ff-896b-98da92bb7393-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "412835ea-6470-43ff-896b-98da92bb7393" (UID: "412835ea-6470-43ff-896b-98da92bb7393"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.781548 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/412835ea-6470-43ff-896b-98da92bb7393-kube-api-access-48bhv" (OuterVolumeSpecName: "kube-api-access-48bhv") pod "412835ea-6470-43ff-896b-98da92bb7393" (UID: "412835ea-6470-43ff-896b-98da92bb7393"). InnerVolumeSpecName "kube-api-access-48bhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.783728 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-scripts" (OuterVolumeSpecName: "scripts") pod "412835ea-6470-43ff-896b-98da92bb7393" (UID: "412835ea-6470-43ff-896b-98da92bb7393"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.803904 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "412835ea-6470-43ff-896b-98da92bb7393" (UID: "412835ea-6470-43ff-896b-98da92bb7393"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.843611 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "412835ea-6470-43ff-896b-98da92bb7393" (UID: "412835ea-6470-43ff-896b-98da92bb7393"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.872912 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-config-data" (OuterVolumeSpecName: "config-data") pod "412835ea-6470-43ff-896b-98da92bb7393" (UID: "412835ea-6470-43ff-896b-98da92bb7393"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.877765 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.877804 4599 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.877815 4599 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/412835ea-6470-43ff-896b-98da92bb7393-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.877824 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.877835 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48bhv\" (UniqueName: \"kubernetes.io/projected/412835ea-6470-43ff-896b-98da92bb7393-kube-api-access-48bhv\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:11 crc kubenswrapper[4599]: I1012 07:50:11.877843 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412835ea-6470-43ff-896b-98da92bb7393-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.132214 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b6db97ed-496b-4f4d-bb27-2bce6e003912","Type":"ContainerStarted","Data":"e9f8ca2f7eb76943f666d8d7b2be6484eabf55fdd2fc0bbb8d234acd38d71b4c"} Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 
07:50:12.135480 4599 generic.go:334] "Generic (PLEG): container finished" podID="88d1ee0b-4340-44c2-b357-1eb1741e30ca" containerID="69ac54b513822bd5d17bba9e6347a4aa4221ed49830122d6174fde84876ecdd4" exitCode=143 Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.135545 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"88d1ee0b-4340-44c2-b357-1eb1741e30ca","Type":"ContainerDied","Data":"69ac54b513822bd5d17bba9e6347a4aa4221ed49830122d6174fde84876ecdd4"} Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.138505 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"412835ea-6470-43ff-896b-98da92bb7393","Type":"ContainerDied","Data":"020e24f9fcc28864b5d078659cbc3da7fa453b8a26a4b9e6ea96dd95cb9212c1"} Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.138546 4599 scope.go:117] "RemoveContainer" containerID="7c4482729785f9de8488a78c1c54e0e328e2b6a2cd8a10effc46ef27c500410d" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.138661 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.144986 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dc31b00c-21aa-4427-9b7a-c505c0520a2b","Type":"ContainerDied","Data":"159b7718922d0ed3a94be25af6599182de8723adcf8b49b4193ff782af3811cd"} Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.145096 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.161395 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.963804837 podStartE2EDuration="13.161375446s" podCreationTimestamp="2025-10-12 07:49:59 +0000 UTC" firstStartedPulling="2025-10-12 07:50:00.257517524 +0000 UTC m=+897.046713026" lastFinishedPulling="2025-10-12 07:50:11.455088133 +0000 UTC m=+908.244283635" observedRunningTime="2025-10-12 07:50:12.150751837 +0000 UTC m=+908.939947339" watchObservedRunningTime="2025-10-12 07:50:12.161375446 +0000 UTC m=+908.950570948" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.173926 4599 scope.go:117] "RemoveContainer" containerID="6dc982e6d6678280a87bbc087bc07e454ed6ac050468fb389f84f383b998624b" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.178961 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.191665 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.200810 4599 scope.go:117] "RemoveContainer" containerID="0b52b9ddb35fd1b5fde0183f396cf3cc0ff2109d3a809f577cd974374526c102" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.207406 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.215642 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.221231 4599 scope.go:117] "RemoveContainer" containerID="9f1e3309f87e6bc3c1a86687b03d8d42468c0a05e56b65a1d42faa3f80cb0de9" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.228154 4599 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-internal-api-0"] Oct 12 07:50:12 crc kubenswrapper[4599]: E1012 07:50:12.228583 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc31b00c-21aa-4427-9b7a-c505c0520a2b" containerName="glance-log" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.228598 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc31b00c-21aa-4427-9b7a-c505c0520a2b" containerName="glance-log" Oct 12 07:50:12 crc kubenswrapper[4599]: E1012 07:50:12.228615 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412835ea-6470-43ff-896b-98da92bb7393" containerName="sg-core" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.228621 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="412835ea-6470-43ff-896b-98da92bb7393" containerName="sg-core" Oct 12 07:50:12 crc kubenswrapper[4599]: E1012 07:50:12.228636 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412835ea-6470-43ff-896b-98da92bb7393" containerName="proxy-httpd" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.228644 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="412835ea-6470-43ff-896b-98da92bb7393" containerName="proxy-httpd" Oct 12 07:50:12 crc kubenswrapper[4599]: E1012 07:50:12.228661 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412835ea-6470-43ff-896b-98da92bb7393" containerName="ceilometer-central-agent" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.228666 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="412835ea-6470-43ff-896b-98da92bb7393" containerName="ceilometer-central-agent" Oct 12 07:50:12 crc kubenswrapper[4599]: E1012 07:50:12.228679 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412835ea-6470-43ff-896b-98da92bb7393" containerName="ceilometer-notification-agent" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.228684 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="412835ea-6470-43ff-896b-98da92bb7393" 
containerName="ceilometer-notification-agent" Oct 12 07:50:12 crc kubenswrapper[4599]: E1012 07:50:12.228692 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc31b00c-21aa-4427-9b7a-c505c0520a2b" containerName="glance-httpd" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.228699 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc31b00c-21aa-4427-9b7a-c505c0520a2b" containerName="glance-httpd" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.228864 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="412835ea-6470-43ff-896b-98da92bb7393" containerName="proxy-httpd" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.228875 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc31b00c-21aa-4427-9b7a-c505c0520a2b" containerName="glance-log" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.228889 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc31b00c-21aa-4427-9b7a-c505c0520a2b" containerName="glance-httpd" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.228898 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="412835ea-6470-43ff-896b-98da92bb7393" containerName="sg-core" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.228908 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="412835ea-6470-43ff-896b-98da92bb7393" containerName="ceilometer-notification-agent" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.228918 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="412835ea-6470-43ff-896b-98da92bb7393" containerName="ceilometer-central-agent" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.229913 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.236306 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.236566 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.236738 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.238917 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.241382 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.241582 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.241619 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.244951 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.245553 4599 scope.go:117] "RemoveContainer" containerID="f912827e300eaa1554f6da57f1cfb2f16457c99ae40ae89ae4358b683feb7234" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.284314 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-scripts\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.284385 4599 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fec8099-a2ba-4cbd-af30-75a787e3ead1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.284461 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.284500 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fec8099-a2ba-4cbd-af30-75a787e3ead1-logs\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.284545 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-config-data\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.284560 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.284589 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fec8099-a2ba-4cbd-af30-75a787e3ead1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.284642 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snfzw\" (UniqueName: \"kubernetes.io/projected/231584a7-c306-4904-9e9b-33789d2d42bd-kube-api-access-snfzw\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.284676 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.284692 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/231584a7-c306-4904-9e9b-33789d2d42bd-run-httpd\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.284735 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhm97\" (UniqueName: \"kubernetes.io/projected/7fec8099-a2ba-4cbd-af30-75a787e3ead1-kube-api-access-rhm97\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.284777 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/231584a7-c306-4904-9e9b-33789d2d42bd-log-httpd\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.284932 4599 scope.go:117] "RemoveContainer" containerID="66e2798ff6540355b4f8a0476df8a51f5593324c7dc6dcff58b90c70631d878f" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.286043 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fec8099-a2ba-4cbd-af30-75a787e3ead1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.286102 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fec8099-a2ba-4cbd-af30-75a787e3ead1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.286123 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7fec8099-a2ba-4cbd-af30-75a787e3ead1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.387624 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " 
pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.387703 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fec8099-a2ba-4cbd-af30-75a787e3ead1-logs\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.387748 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-config-data\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.387766 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.387795 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fec8099-a2ba-4cbd-af30-75a787e3ead1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.387829 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snfzw\" (UniqueName: \"kubernetes.io/projected/231584a7-c306-4904-9e9b-33789d2d42bd-kube-api-access-snfzw\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.387868 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.387886 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/231584a7-c306-4904-9e9b-33789d2d42bd-run-httpd\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.387927 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhm97\" (UniqueName: \"kubernetes.io/projected/7fec8099-a2ba-4cbd-af30-75a787e3ead1-kube-api-access-rhm97\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.387956 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/231584a7-c306-4904-9e9b-33789d2d42bd-log-httpd\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.387972 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fec8099-a2ba-4cbd-af30-75a787e3ead1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.387998 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fec8099-a2ba-4cbd-af30-75a787e3ead1-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.388017 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7fec8099-a2ba-4cbd-af30-75a787e3ead1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.388039 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-scripts\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.388064 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fec8099-a2ba-4cbd-af30-75a787e3ead1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.389432 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/231584a7-c306-4904-9e9b-33789d2d42bd-log-httpd\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.389483 4599 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 12 
07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.389666 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/231584a7-c306-4904-9e9b-33789d2d42bd-run-httpd\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.390132 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fec8099-a2ba-4cbd-af30-75a787e3ead1-logs\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.390512 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7fec8099-a2ba-4cbd-af30-75a787e3ead1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.396671 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fec8099-a2ba-4cbd-af30-75a787e3ead1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.396729 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-config-data\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.396973 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7fec8099-a2ba-4cbd-af30-75a787e3ead1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.397187 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.399812 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-scripts\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.400990 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fec8099-a2ba-4cbd-af30-75a787e3ead1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.404536 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.405073 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fec8099-a2ba-4cbd-af30-75a787e3ead1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " 
pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.406743 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhm97\" (UniqueName: \"kubernetes.io/projected/7fec8099-a2ba-4cbd-af30-75a787e3ead1-kube-api-access-rhm97\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.407172 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snfzw\" (UniqueName: \"kubernetes.io/projected/231584a7-c306-4904-9e9b-33789d2d42bd-kube-api-access-snfzw\") pod \"ceilometer-0\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.429440 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"7fec8099-a2ba-4cbd-af30-75a787e3ead1\") " pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.560432 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.569760 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:50:12 crc kubenswrapper[4599]: I1012 07:50:12.989947 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:50:13 crc kubenswrapper[4599]: I1012 07:50:13.041354 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 07:50:13 crc kubenswrapper[4599]: I1012 07:50:13.167594 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"231584a7-c306-4904-9e9b-33789d2d42bd","Type":"ContainerStarted","Data":"059691a96770f3d3e027021cd291e5406ca3ab15d9e044c27da87547f4d01d10"} Oct 12 07:50:13 crc kubenswrapper[4599]: I1012 07:50:13.168752 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7fec8099-a2ba-4cbd-af30-75a787e3ead1","Type":"ContainerStarted","Data":"b51411b575c7f64781cc8e3296dc05c87783cb58230336a56d4c445ec998ac89"} Oct 12 07:50:13 crc kubenswrapper[4599]: I1012 07:50:13.557130 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="412835ea-6470-43ff-896b-98da92bb7393" path="/var/lib/kubelet/pods/412835ea-6470-43ff-896b-98da92bb7393/volumes" Oct 12 07:50:13 crc kubenswrapper[4599]: I1012 07:50:13.558398 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc31b00c-21aa-4427-9b7a-c505c0520a2b" path="/var/lib/kubelet/pods/dc31b00c-21aa-4427-9b7a-c505c0520a2b/volumes" Oct 12 07:50:13 crc kubenswrapper[4599]: I1012 07:50:13.777064 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:50:14 crc kubenswrapper[4599]: I1012 07:50:14.181833 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"231584a7-c306-4904-9e9b-33789d2d42bd","Type":"ContainerStarted","Data":"7f6072cfb4f6896768d0e01eca5e8c15338d230a547fd6a1d3ca96a4b37bfe68"} Oct 12 07:50:14 crc kubenswrapper[4599]: I1012 07:50:14.184096 4599 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7fec8099-a2ba-4cbd-af30-75a787e3ead1","Type":"ContainerStarted","Data":"310c72e1c155578e5e16fcdff0c310060924cba4f1d06fe10903ced5bb80bbb8"} Oct 12 07:50:14 crc kubenswrapper[4599]: I1012 07:50:14.184154 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7fec8099-a2ba-4cbd-af30-75a787e3ead1","Type":"ContainerStarted","Data":"53bb2049b2bb4b123871969f8012883c693ac16b34c436216c4d33e9f5b02f7d"} Oct 12 07:50:14 crc kubenswrapper[4599]: I1012 07:50:14.202416 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.202395691 podStartE2EDuration="2.202395691s" podCreationTimestamp="2025-10-12 07:50:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:50:14.198356378 +0000 UTC m=+910.987551880" watchObservedRunningTime="2025-10-12 07:50:14.202395691 +0000 UTC m=+910.991591193" Oct 12 07:50:14 crc kubenswrapper[4599]: I1012 07:50:14.758125 4599 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="88d1ee0b-4340-44c2-b357-1eb1741e30ca" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.143:9292/healthcheck\": read tcp 10.217.0.2:49034->10.217.0.143:9292: read: connection reset by peer" Oct 12 07:50:14 crc kubenswrapper[4599]: I1012 07:50:14.758123 4599 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="88d1ee0b-4340-44c2-b357-1eb1741e30ca" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.143:9292/healthcheck\": read tcp 10.217.0.2:49024->10.217.0.143:9292: read: connection reset by peer" Oct 12 07:50:14 crc kubenswrapper[4599]: I1012 07:50:14.943378 4599 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:14 crc kubenswrapper[4599]: I1012 07:50:14.956375 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-786bc7649f-6qt66" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.196199 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"231584a7-c306-4904-9e9b-33789d2d42bd","Type":"ContainerStarted","Data":"9f0345b593bc8549548e28ca4afefae19c14e8a42fbfbdf1e752f086d5846fe4"} Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.199533 4599 generic.go:334] "Generic (PLEG): container finished" podID="88d1ee0b-4340-44c2-b357-1eb1741e30ca" containerID="c1e366286f4e9ddc378f0672a98684e65e1219d248fbd57600ff6fceda569cec" exitCode=0 Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.199561 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"88d1ee0b-4340-44c2-b357-1eb1741e30ca","Type":"ContainerDied","Data":"c1e366286f4e9ddc378f0672a98684e65e1219d248fbd57600ff6fceda569cec"} Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.238328 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.248940 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-combined-ca-bundle\") pod \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.248994 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88d1ee0b-4340-44c2-b357-1eb1741e30ca-logs\") pod \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.249032 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-scripts\") pod \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.249050 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88d1ee0b-4340-44c2-b357-1eb1741e30ca-httpd-run\") pod \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.249079 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.249120 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-config-data\") pod \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.249164 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-public-tls-certs\") pod \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.249196 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76449\" (UniqueName: \"kubernetes.io/projected/88d1ee0b-4340-44c2-b357-1eb1741e30ca-kube-api-access-76449\") pod \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\" (UID: \"88d1ee0b-4340-44c2-b357-1eb1741e30ca\") " Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.250720 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88d1ee0b-4340-44c2-b357-1eb1741e30ca-logs" (OuterVolumeSpecName: "logs") pod "88d1ee0b-4340-44c2-b357-1eb1741e30ca" (UID: "88d1ee0b-4340-44c2-b357-1eb1741e30ca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.251583 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88d1ee0b-4340-44c2-b357-1eb1741e30ca-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "88d1ee0b-4340-44c2-b357-1eb1741e30ca" (UID: "88d1ee0b-4340-44c2-b357-1eb1741e30ca"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.273448 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88d1ee0b-4340-44c2-b357-1eb1741e30ca-kube-api-access-76449" (OuterVolumeSpecName: "kube-api-access-76449") pod "88d1ee0b-4340-44c2-b357-1eb1741e30ca" (UID: "88d1ee0b-4340-44c2-b357-1eb1741e30ca"). InnerVolumeSpecName "kube-api-access-76449". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.275401 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "88d1ee0b-4340-44c2-b357-1eb1741e30ca" (UID: "88d1ee0b-4340-44c2-b357-1eb1741e30ca"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.275468 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-scripts" (OuterVolumeSpecName: "scripts") pod "88d1ee0b-4340-44c2-b357-1eb1741e30ca" (UID: "88d1ee0b-4340-44c2-b357-1eb1741e30ca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.302960 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88d1ee0b-4340-44c2-b357-1eb1741e30ca" (UID: "88d1ee0b-4340-44c2-b357-1eb1741e30ca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.322470 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "88d1ee0b-4340-44c2-b357-1eb1741e30ca" (UID: "88d1ee0b-4340-44c2-b357-1eb1741e30ca"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.333727 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-config-data" (OuterVolumeSpecName: "config-data") pod "88d1ee0b-4340-44c2-b357-1eb1741e30ca" (UID: "88d1ee0b-4340-44c2-b357-1eb1741e30ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.352978 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.353084 4599 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88d1ee0b-4340-44c2-b357-1eb1741e30ca-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.353183 4599 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.353243 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 
07:50:15.353312 4599 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.353414 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76449\" (UniqueName: \"kubernetes.io/projected/88d1ee0b-4340-44c2-b357-1eb1741e30ca-kube-api-access-76449\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.353471 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d1ee0b-4340-44c2-b357-1eb1741e30ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.353526 4599 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88d1ee0b-4340-44c2-b357-1eb1741e30ca-logs\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.388387 4599 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.458150 4599 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.476313 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vvd6p"] Oct 12 07:50:15 crc kubenswrapper[4599]: E1012 07:50:15.476681 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88d1ee0b-4340-44c2-b357-1eb1741e30ca" containerName="glance-httpd" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.476700 4599 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="88d1ee0b-4340-44c2-b357-1eb1741e30ca" containerName="glance-httpd" Oct 12 07:50:15 crc kubenswrapper[4599]: E1012 07:50:15.476715 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88d1ee0b-4340-44c2-b357-1eb1741e30ca" containerName="glance-log" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.476721 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="88d1ee0b-4340-44c2-b357-1eb1741e30ca" containerName="glance-log" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.476888 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="88d1ee0b-4340-44c2-b357-1eb1741e30ca" containerName="glance-log" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.476904 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="88d1ee0b-4340-44c2-b357-1eb1741e30ca" containerName="glance-httpd" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.478126 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vvd6p" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.487101 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vvd6p"] Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.561533 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd9da54-4ee3-40c8-af3b-fb522971c160-utilities\") pod \"certified-operators-vvd6p\" (UID: \"fdd9da54-4ee3-40c8-af3b-fb522971c160\") " pod="openshift-marketplace/certified-operators-vvd6p" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.561619 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw9m8\" (UniqueName: \"kubernetes.io/projected/fdd9da54-4ee3-40c8-af3b-fb522971c160-kube-api-access-tw9m8\") pod \"certified-operators-vvd6p\" (UID: 
\"fdd9da54-4ee3-40c8-af3b-fb522971c160\") " pod="openshift-marketplace/certified-operators-vvd6p" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.561664 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd9da54-4ee3-40c8-af3b-fb522971c160-catalog-content\") pod \"certified-operators-vvd6p\" (UID: \"fdd9da54-4ee3-40c8-af3b-fb522971c160\") " pod="openshift-marketplace/certified-operators-vvd6p" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.663038 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd9da54-4ee3-40c8-af3b-fb522971c160-catalog-content\") pod \"certified-operators-vvd6p\" (UID: \"fdd9da54-4ee3-40c8-af3b-fb522971c160\") " pod="openshift-marketplace/certified-operators-vvd6p" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.663179 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd9da54-4ee3-40c8-af3b-fb522971c160-utilities\") pod \"certified-operators-vvd6p\" (UID: \"fdd9da54-4ee3-40c8-af3b-fb522971c160\") " pod="openshift-marketplace/certified-operators-vvd6p" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.663248 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw9m8\" (UniqueName: \"kubernetes.io/projected/fdd9da54-4ee3-40c8-af3b-fb522971c160-kube-api-access-tw9m8\") pod \"certified-operators-vvd6p\" (UID: \"fdd9da54-4ee3-40c8-af3b-fb522971c160\") " pod="openshift-marketplace/certified-operators-vvd6p" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.663937 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd9da54-4ee3-40c8-af3b-fb522971c160-catalog-content\") pod \"certified-operators-vvd6p\" (UID: 
\"fdd9da54-4ee3-40c8-af3b-fb522971c160\") " pod="openshift-marketplace/certified-operators-vvd6p" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.663937 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd9da54-4ee3-40c8-af3b-fb522971c160-utilities\") pod \"certified-operators-vvd6p\" (UID: \"fdd9da54-4ee3-40c8-af3b-fb522971c160\") " pod="openshift-marketplace/certified-operators-vvd6p" Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.681923 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw9m8\" (UniqueName: \"kubernetes.io/projected/fdd9da54-4ee3-40c8-af3b-fb522971c160-kube-api-access-tw9m8\") pod \"certified-operators-vvd6p\" (UID: \"fdd9da54-4ee3-40c8-af3b-fb522971c160\") " pod="openshift-marketplace/certified-operators-vvd6p" Oct 12 07:50:15 crc kubenswrapper[4599]: E1012 07:50:15.763106 4599 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/96c9b4c01826453f6b4834e66cd5afeed5daf7e43e9eb0e3e2d4770f95934c70/diff" to get inode usage: stat /var/lib/containers/storage/overlay/96c9b4c01826453f6b4834e66cd5afeed5daf7e43e9eb0e3e2d4770f95934c70/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_glance-default-internal-api-0_dc31b00c-21aa-4427-9b7a-c505c0520a2b/glance-log/0.log" to get inode usage: stat /var/log/pods/openstack_glance-default-internal-api-0_dc31b00c-21aa-4427-9b7a-c505c0520a2b/glance-log/0.log: no such file or directory Oct 12 07:50:15 crc kubenswrapper[4599]: I1012 07:50:15.800556 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vvd6p" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.211656 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"231584a7-c306-4904-9e9b-33789d2d42bd","Type":"ContainerStarted","Data":"31fa37b1108d09e65952d61876d188c75a92a7ec64185f51761a8b0c066ca976"} Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.214183 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"88d1ee0b-4340-44c2-b357-1eb1741e30ca","Type":"ContainerDied","Data":"4a806162947aca9c71f7c24295e69d42ef5b1ff04a34e49bf337c8f4cdc1d598"} Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.214248 4599 scope.go:117] "RemoveContainer" containerID="c1e366286f4e9ddc378f0672a98684e65e1219d248fbd57600ff6fceda569cec" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.214269 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.235749 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.236779 4599 scope.go:117] "RemoveContainer" containerID="69ac54b513822bd5d17bba9e6347a4aa4221ed49830122d6174fde84876ecdd4" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.243915 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.256962 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.263120 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.264955 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.265315 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.275533 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.376968 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.377012 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a26e3618-b309-4ac5-b0e1-39feba422ef6-scripts\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.377046 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a26e3618-b309-4ac5-b0e1-39feba422ef6-config-data\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.377074 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a26e3618-b309-4ac5-b0e1-39feba422ef6-logs\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.377122 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a26e3618-b309-4ac5-b0e1-39feba422ef6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.377164 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5wjg\" (UniqueName: \"kubernetes.io/projected/a26e3618-b309-4ac5-b0e1-39feba422ef6-kube-api-access-b5wjg\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.377187 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a26e3618-b309-4ac5-b0e1-39feba422ef6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.377233 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a26e3618-b309-4ac5-b0e1-39feba422ef6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.411058 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-vvd6p"] Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.478702 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rk6kl"] Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.478716 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5wjg\" (UniqueName: \"kubernetes.io/projected/a26e3618-b309-4ac5-b0e1-39feba422ef6-kube-api-access-b5wjg\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.480245 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a26e3618-b309-4ac5-b0e1-39feba422ef6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.480290 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a26e3618-b309-4ac5-b0e1-39feba422ef6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.480449 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a26e3618-b309-4ac5-b0e1-39feba422ef6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.480615 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.480640 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a26e3618-b309-4ac5-b0e1-39feba422ef6-scripts\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.480702 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a26e3618-b309-4ac5-b0e1-39feba422ef6-config-data\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.480762 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a26e3618-b309-4ac5-b0e1-39feba422ef6-logs\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.481093 4599 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.481256 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a26e3618-b309-4ac5-b0e1-39feba422ef6-logs\") pod \"glance-default-external-api-0\" (UID: 
\"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.490794 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a26e3618-b309-4ac5-b0e1-39feba422ef6-scripts\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.491273 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a26e3618-b309-4ac5-b0e1-39feba422ef6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.496167 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a26e3618-b309-4ac5-b0e1-39feba422ef6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.498990 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a26e3618-b309-4ac5-b0e1-39feba422ef6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.505302 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rk6kl"] Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.507474 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rk6kl" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.508674 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a26e3618-b309-4ac5-b0e1-39feba422ef6-config-data\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.515890 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5wjg\" (UniqueName: \"kubernetes.io/projected/a26e3618-b309-4ac5-b0e1-39feba422ef6-kube-api-access-b5wjg\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.519932 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"a26e3618-b309-4ac5-b0e1-39feba422ef6\") " pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.582513 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c5zt\" (UniqueName: \"kubernetes.io/projected/9f317920-4f82-4691-8e1c-31b0b35526e8-kube-api-access-2c5zt\") pod \"redhat-marketplace-rk6kl\" (UID: \"9f317920-4f82-4691-8e1c-31b0b35526e8\") " pod="openshift-marketplace/redhat-marketplace-rk6kl" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.582565 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f317920-4f82-4691-8e1c-31b0b35526e8-utilities\") pod \"redhat-marketplace-rk6kl\" (UID: \"9f317920-4f82-4691-8e1c-31b0b35526e8\") " 
pod="openshift-marketplace/redhat-marketplace-rk6kl" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.582631 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f317920-4f82-4691-8e1c-31b0b35526e8-catalog-content\") pod \"redhat-marketplace-rk6kl\" (UID: \"9f317920-4f82-4691-8e1c-31b0b35526e8\") " pod="openshift-marketplace/redhat-marketplace-rk6kl" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.594594 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.684555 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c5zt\" (UniqueName: \"kubernetes.io/projected/9f317920-4f82-4691-8e1c-31b0b35526e8-kube-api-access-2c5zt\") pod \"redhat-marketplace-rk6kl\" (UID: \"9f317920-4f82-4691-8e1c-31b0b35526e8\") " pod="openshift-marketplace/redhat-marketplace-rk6kl" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.684616 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f317920-4f82-4691-8e1c-31b0b35526e8-utilities\") pod \"redhat-marketplace-rk6kl\" (UID: \"9f317920-4f82-4691-8e1c-31b0b35526e8\") " pod="openshift-marketplace/redhat-marketplace-rk6kl" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.684673 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f317920-4f82-4691-8e1c-31b0b35526e8-catalog-content\") pod \"redhat-marketplace-rk6kl\" (UID: \"9f317920-4f82-4691-8e1c-31b0b35526e8\") " pod="openshift-marketplace/redhat-marketplace-rk6kl" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.685094 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/9f317920-4f82-4691-8e1c-31b0b35526e8-catalog-content\") pod \"redhat-marketplace-rk6kl\" (UID: \"9f317920-4f82-4691-8e1c-31b0b35526e8\") " pod="openshift-marketplace/redhat-marketplace-rk6kl" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.685580 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f317920-4f82-4691-8e1c-31b0b35526e8-utilities\") pod \"redhat-marketplace-rk6kl\" (UID: \"9f317920-4f82-4691-8e1c-31b0b35526e8\") " pod="openshift-marketplace/redhat-marketplace-rk6kl" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.705234 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c5zt\" (UniqueName: \"kubernetes.io/projected/9f317920-4f82-4691-8e1c-31b0b35526e8-kube-api-access-2c5zt\") pod \"redhat-marketplace-rk6kl\" (UID: \"9f317920-4f82-4691-8e1c-31b0b35526e8\") " pod="openshift-marketplace/redhat-marketplace-rk6kl" Oct 12 07:50:16 crc kubenswrapper[4599]: I1012 07:50:16.880212 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rk6kl" Oct 12 07:50:17 crc kubenswrapper[4599]: I1012 07:50:17.127434 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 07:50:17 crc kubenswrapper[4599]: I1012 07:50:17.234743 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"231584a7-c306-4904-9e9b-33789d2d42bd","Type":"ContainerStarted","Data":"d6a485cb3d09277c99a1466dfa01b9a467b7629f0348cf90738e98d023d5b588"} Oct 12 07:50:17 crc kubenswrapper[4599]: I1012 07:50:17.234915 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="231584a7-c306-4904-9e9b-33789d2d42bd" containerName="ceilometer-central-agent" containerID="cri-o://7f6072cfb4f6896768d0e01eca5e8c15338d230a547fd6a1d3ca96a4b37bfe68" gracePeriod=30 Oct 12 07:50:17 crc kubenswrapper[4599]: I1012 07:50:17.235175 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 12 07:50:17 crc kubenswrapper[4599]: I1012 07:50:17.235445 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="231584a7-c306-4904-9e9b-33789d2d42bd" containerName="proxy-httpd" containerID="cri-o://d6a485cb3d09277c99a1466dfa01b9a467b7629f0348cf90738e98d023d5b588" gracePeriod=30 Oct 12 07:50:17 crc kubenswrapper[4599]: I1012 07:50:17.235489 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="231584a7-c306-4904-9e9b-33789d2d42bd" containerName="sg-core" containerID="cri-o://31fa37b1108d09e65952d61876d188c75a92a7ec64185f51761a8b0c066ca976" gracePeriod=30 Oct 12 07:50:17 crc kubenswrapper[4599]: I1012 07:50:17.235523 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="231584a7-c306-4904-9e9b-33789d2d42bd" containerName="ceilometer-notification-agent" 
containerID="cri-o://9f0345b593bc8549548e28ca4afefae19c14e8a42fbfbdf1e752f086d5846fe4" gracePeriod=30 Oct 12 07:50:17 crc kubenswrapper[4599]: I1012 07:50:17.242767 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a26e3618-b309-4ac5-b0e1-39feba422ef6","Type":"ContainerStarted","Data":"2e044f32b88769e4553c0f9964072cb2dac174892d23490320dec3a4b20e80a3"} Oct 12 07:50:17 crc kubenswrapper[4599]: I1012 07:50:17.249563 4599 generic.go:334] "Generic (PLEG): container finished" podID="fdd9da54-4ee3-40c8-af3b-fb522971c160" containerID="958085b6614b6b8062018ad8f344e2baa29e201b99985fc68ec2442f058e8924" exitCode=0 Oct 12 07:50:17 crc kubenswrapper[4599]: I1012 07:50:17.249605 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vvd6p" event={"ID":"fdd9da54-4ee3-40c8-af3b-fb522971c160","Type":"ContainerDied","Data":"958085b6614b6b8062018ad8f344e2baa29e201b99985fc68ec2442f058e8924"} Oct 12 07:50:17 crc kubenswrapper[4599]: I1012 07:50:17.249620 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vvd6p" event={"ID":"fdd9da54-4ee3-40c8-af3b-fb522971c160","Type":"ContainerStarted","Data":"98d8990c532bc8d4bd995fad295fda1d7f633ac8745f0c04b80852807c81309a"} Oct 12 07:50:17 crc kubenswrapper[4599]: I1012 07:50:17.263254 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.577455018 podStartE2EDuration="5.263240348s" podCreationTimestamp="2025-10-12 07:50:12 +0000 UTC" firstStartedPulling="2025-10-12 07:50:12.991472469 +0000 UTC m=+909.780667971" lastFinishedPulling="2025-10-12 07:50:16.677257798 +0000 UTC m=+913.466453301" observedRunningTime="2025-10-12 07:50:17.251205867 +0000 UTC m=+914.040401368" watchObservedRunningTime="2025-10-12 07:50:17.263240348 +0000 UTC m=+914.052435851" Oct 12 07:50:17 crc kubenswrapper[4599]: I1012 07:50:17.366822 4599 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rk6kl"] Oct 12 07:50:17 crc kubenswrapper[4599]: I1012 07:50:17.592627 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88d1ee0b-4340-44c2-b357-1eb1741e30ca" path="/var/lib/kubelet/pods/88d1ee0b-4340-44c2-b357-1eb1741e30ca/volumes" Oct 12 07:50:17 crc kubenswrapper[4599]: E1012 07:50:17.965278 4599 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/624b20732ef53d900ce5a394d1d60109fb393035b6eef493c1ab8630a507e9b6/diff" to get inode usage: stat /var/lib/containers/storage/overlay/624b20732ef53d900ce5a394d1d60109fb393035b6eef493c1ab8630a507e9b6/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_glance-default-external-api-0_88d1ee0b-4340-44c2-b357-1eb1741e30ca/glance-httpd/0.log" to get inode usage: stat /var/log/pods/openstack_glance-default-external-api-0_88d1ee0b-4340-44c2-b357-1eb1741e30ca/glance-httpd/0.log: no such file or directory Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.029478 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.113586 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-config-data\") pod \"df8f9a6f-b14e-4241-9102-4473869410d8\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.113639 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-combined-ca-bundle\") pod \"df8f9a6f-b14e-4241-9102-4473869410d8\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.113739 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68qvr\" (UniqueName: \"kubernetes.io/projected/df8f9a6f-b14e-4241-9102-4473869410d8-kube-api-access-68qvr\") pod \"df8f9a6f-b14e-4241-9102-4473869410d8\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.113831 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df8f9a6f-b14e-4241-9102-4473869410d8-logs\") pod \"df8f9a6f-b14e-4241-9102-4473869410d8\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.113895 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-scripts\") pod \"df8f9a6f-b14e-4241-9102-4473869410d8\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.113916 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-config-data-custom\") pod \"df8f9a6f-b14e-4241-9102-4473869410d8\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.114003 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df8f9a6f-b14e-4241-9102-4473869410d8-etc-machine-id\") pod \"df8f9a6f-b14e-4241-9102-4473869410d8\" (UID: \"df8f9a6f-b14e-4241-9102-4473869410d8\") " Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.114519 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df8f9a6f-b14e-4241-9102-4473869410d8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "df8f9a6f-b14e-4241-9102-4473869410d8" (UID: "df8f9a6f-b14e-4241-9102-4473869410d8"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.114867 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df8f9a6f-b14e-4241-9102-4473869410d8-logs" (OuterVolumeSpecName: "logs") pod "df8f9a6f-b14e-4241-9102-4473869410d8" (UID: "df8f9a6f-b14e-4241-9102-4473869410d8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.123811 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "df8f9a6f-b14e-4241-9102-4473869410d8" (UID: "df8f9a6f-b14e-4241-9102-4473869410d8"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.139584 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-scripts" (OuterVolumeSpecName: "scripts") pod "df8f9a6f-b14e-4241-9102-4473869410d8" (UID: "df8f9a6f-b14e-4241-9102-4473869410d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.139725 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df8f9a6f-b14e-4241-9102-4473869410d8-kube-api-access-68qvr" (OuterVolumeSpecName: "kube-api-access-68qvr") pod "df8f9a6f-b14e-4241-9102-4473869410d8" (UID: "df8f9a6f-b14e-4241-9102-4473869410d8"). InnerVolumeSpecName "kube-api-access-68qvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.146290 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df8f9a6f-b14e-4241-9102-4473869410d8" (UID: "df8f9a6f-b14e-4241-9102-4473869410d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.188433 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-config-data" (OuterVolumeSpecName: "config-data") pod "df8f9a6f-b14e-4241-9102-4473869410d8" (UID: "df8f9a6f-b14e-4241-9102-4473869410d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.218458 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.218484 4599 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.218496 4599 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df8f9a6f-b14e-4241-9102-4473869410d8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.218504 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.218513 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8f9a6f-b14e-4241-9102-4473869410d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.218521 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68qvr\" (UniqueName: \"kubernetes.io/projected/df8f9a6f-b14e-4241-9102-4473869410d8-kube-api-access-68qvr\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.218531 4599 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df8f9a6f-b14e-4241-9102-4473869410d8-logs\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.228628 4599 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-586f7cb8d6-xq2lk" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.272314 4599 generic.go:334] "Generic (PLEG): container finished" podID="df8f9a6f-b14e-4241-9102-4473869410d8" containerID="28a2b22c580e0d331820a034f24313ab13868922510e7fd9b01c2e507e9f2972" exitCode=137 Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.272403 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.272408 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"df8f9a6f-b14e-4241-9102-4473869410d8","Type":"ContainerDied","Data":"28a2b22c580e0d331820a034f24313ab13868922510e7fd9b01c2e507e9f2972"} Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.272551 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"df8f9a6f-b14e-4241-9102-4473869410d8","Type":"ContainerDied","Data":"ce6c44a2f2214b77bd3b08852683df1cfd012a0abe8006d0945c8be28d90cc2f"} Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.272583 4599 scope.go:117] "RemoveContainer" containerID="28a2b22c580e0d331820a034f24313ab13868922510e7fd9b01c2e507e9f2972" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.275831 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vvd6p" event={"ID":"fdd9da54-4ee3-40c8-af3b-fb522971c160","Type":"ContainerStarted","Data":"8b426305db57f447fca7b13b43be22f2b95a9f61dee44bf35039d349186b6df1"} Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.280927 4599 generic.go:334] "Generic (PLEG): container finished" podID="9f317920-4f82-4691-8e1c-31b0b35526e8" containerID="f9cc29531352bab33de61b83e169f4cc518d5c558085a4b25545c45ffe540bc5" exitCode=0 Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.280996 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-rk6kl" event={"ID":"9f317920-4f82-4691-8e1c-31b0b35526e8","Type":"ContainerDied","Data":"f9cc29531352bab33de61b83e169f4cc518d5c558085a4b25545c45ffe540bc5"} Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.281025 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rk6kl" event={"ID":"9f317920-4f82-4691-8e1c-31b0b35526e8","Type":"ContainerStarted","Data":"d66a28f2ed0b2c521f2fb1f7a3b55ae8b19955a89a1cec31c7131bce347b42bd"} Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.284232 4599 generic.go:334] "Generic (PLEG): container finished" podID="231584a7-c306-4904-9e9b-33789d2d42bd" containerID="d6a485cb3d09277c99a1466dfa01b9a467b7629f0348cf90738e98d023d5b588" exitCode=0 Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.284263 4599 generic.go:334] "Generic (PLEG): container finished" podID="231584a7-c306-4904-9e9b-33789d2d42bd" containerID="31fa37b1108d09e65952d61876d188c75a92a7ec64185f51761a8b0c066ca976" exitCode=2 Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.284271 4599 generic.go:334] "Generic (PLEG): container finished" podID="231584a7-c306-4904-9e9b-33789d2d42bd" containerID="9f0345b593bc8549548e28ca4afefae19c14e8a42fbfbdf1e752f086d5846fe4" exitCode=0 Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.284315 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"231584a7-c306-4904-9e9b-33789d2d42bd","Type":"ContainerDied","Data":"d6a485cb3d09277c99a1466dfa01b9a467b7629f0348cf90738e98d023d5b588"} Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.284357 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"231584a7-c306-4904-9e9b-33789d2d42bd","Type":"ContainerDied","Data":"31fa37b1108d09e65952d61876d188c75a92a7ec64185f51761a8b0c066ca976"} Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.284370 4599 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"231584a7-c306-4904-9e9b-33789d2d42bd","Type":"ContainerDied","Data":"9f0345b593bc8549548e28ca4afefae19c14e8a42fbfbdf1e752f086d5846fe4"} Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.285544 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a26e3618-b309-4ac5-b0e1-39feba422ef6","Type":"ContainerStarted","Data":"92bbbcbeb0bb4546f381d8c54d99b14f557c569ab6d3a62104494d4da5eb0a18"} Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.295996 4599 scope.go:117] "RemoveContainer" containerID="b116cb821939a9c933101c37932183909c5cfd195bf545978df7572d26a519fe" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.331828 4599 scope.go:117] "RemoveContainer" containerID="28a2b22c580e0d331820a034f24313ab13868922510e7fd9b01c2e507e9f2972" Oct 12 07:50:18 crc kubenswrapper[4599]: E1012 07:50:18.332759 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28a2b22c580e0d331820a034f24313ab13868922510e7fd9b01c2e507e9f2972\": container with ID starting with 28a2b22c580e0d331820a034f24313ab13868922510e7fd9b01c2e507e9f2972 not found: ID does not exist" containerID="28a2b22c580e0d331820a034f24313ab13868922510e7fd9b01c2e507e9f2972" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.332817 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28a2b22c580e0d331820a034f24313ab13868922510e7fd9b01c2e507e9f2972"} err="failed to get container status \"28a2b22c580e0d331820a034f24313ab13868922510e7fd9b01c2e507e9f2972\": rpc error: code = NotFound desc = could not find container \"28a2b22c580e0d331820a034f24313ab13868922510e7fd9b01c2e507e9f2972\": container with ID starting with 28a2b22c580e0d331820a034f24313ab13868922510e7fd9b01c2e507e9f2972 not found: ID does not exist" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.332849 4599 scope.go:117] 
"RemoveContainer" containerID="b116cb821939a9c933101c37932183909c5cfd195bf545978df7572d26a519fe" Oct 12 07:50:18 crc kubenswrapper[4599]: E1012 07:50:18.334275 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b116cb821939a9c933101c37932183909c5cfd195bf545978df7572d26a519fe\": container with ID starting with b116cb821939a9c933101c37932183909c5cfd195bf545978df7572d26a519fe not found: ID does not exist" containerID="b116cb821939a9c933101c37932183909c5cfd195bf545978df7572d26a519fe" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.334373 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b116cb821939a9c933101c37932183909c5cfd195bf545978df7572d26a519fe"} err="failed to get container status \"b116cb821939a9c933101c37932183909c5cfd195bf545978df7572d26a519fe\": rpc error: code = NotFound desc = could not find container \"b116cb821939a9c933101c37932183909c5cfd195bf545978df7572d26a519fe\": container with ID starting with b116cb821939a9c933101c37932183909c5cfd195bf545978df7572d26a519fe not found: ID does not exist" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.337426 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.342357 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.359131 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 12 07:50:18 crc kubenswrapper[4599]: E1012 07:50:18.359617 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df8f9a6f-b14e-4241-9102-4473869410d8" containerName="cinder-api-log" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.359638 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="df8f9a6f-b14e-4241-9102-4473869410d8" containerName="cinder-api-log" Oct 12 
07:50:18 crc kubenswrapper[4599]: E1012 07:50:18.359680 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df8f9a6f-b14e-4241-9102-4473869410d8" containerName="cinder-api" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.359686 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="df8f9a6f-b14e-4241-9102-4473869410d8" containerName="cinder-api" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.359860 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="df8f9a6f-b14e-4241-9102-4473869410d8" containerName="cinder-api-log" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.359877 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="df8f9a6f-b14e-4241-9102-4473869410d8" containerName="cinder-api" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.360857 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.363197 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.363576 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.363600 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.389703 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.425593 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d006848-2829-4dab-b441-dddfc1737bfa-scripts\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: 
I1012 07:50:18.425702 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d006848-2829-4dab-b441-dddfc1737bfa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.425753 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d006848-2829-4dab-b441-dddfc1737bfa-config-data\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.425780 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d006848-2829-4dab-b441-dddfc1737bfa-config-data-custom\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.426039 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67sdc\" (UniqueName: \"kubernetes.io/projected/5d006848-2829-4dab-b441-dddfc1737bfa-kube-api-access-67sdc\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.426292 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d006848-2829-4dab-b441-dddfc1737bfa-logs\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.426528 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d006848-2829-4dab-b441-dddfc1737bfa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.426613 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d006848-2829-4dab-b441-dddfc1737bfa-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.426653 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d006848-2829-4dab-b441-dddfc1737bfa-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.529070 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d006848-2829-4dab-b441-dddfc1737bfa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.529141 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d006848-2829-4dab-b441-dddfc1737bfa-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.529176 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d006848-2829-4dab-b441-dddfc1737bfa-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.529242 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d006848-2829-4dab-b441-dddfc1737bfa-scripts\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.529252 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d006848-2829-4dab-b441-dddfc1737bfa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.529285 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d006848-2829-4dab-b441-dddfc1737bfa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.529350 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d006848-2829-4dab-b441-dddfc1737bfa-config-data\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.529380 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d006848-2829-4dab-b441-dddfc1737bfa-config-data-custom\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.529426 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-67sdc\" (UniqueName: \"kubernetes.io/projected/5d006848-2829-4dab-b441-dddfc1737bfa-kube-api-access-67sdc\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.529471 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d006848-2829-4dab-b441-dddfc1737bfa-logs\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.530159 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d006848-2829-4dab-b441-dddfc1737bfa-logs\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.535556 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d006848-2829-4dab-b441-dddfc1737bfa-scripts\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.535640 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d006848-2829-4dab-b441-dddfc1737bfa-config-data\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.535727 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d006848-2829-4dab-b441-dddfc1737bfa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.536451 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d006848-2829-4dab-b441-dddfc1737bfa-config-data-custom\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.537777 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d006848-2829-4dab-b441-dddfc1737bfa-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.537842 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d006848-2829-4dab-b441-dddfc1737bfa-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.544203 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67sdc\" (UniqueName: \"kubernetes.io/projected/5d006848-2829-4dab-b441-dddfc1737bfa-kube-api-access-67sdc\") pod \"cinder-api-0\" (UID: \"5d006848-2829-4dab-b441-dddfc1737bfa\") " pod="openstack/cinder-api-0" Oct 12 07:50:18 crc kubenswrapper[4599]: I1012 07:50:18.681575 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.074591 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.241483 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-2p2tr"] Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.243067 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-2p2tr" Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.271293 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2p2tr"] Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.297499 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a26e3618-b309-4ac5-b0e1-39feba422ef6","Type":"ContainerStarted","Data":"c281c3739991f4bcfd274cf832a0d71cfa540b1edf0db7ace762176c3f8ab519"} Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.305302 4599 generic.go:334] "Generic (PLEG): container finished" podID="fdd9da54-4ee3-40c8-af3b-fb522971c160" containerID="8b426305db57f447fca7b13b43be22f2b95a9f61dee44bf35039d349186b6df1" exitCode=0 Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.305395 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vvd6p" event={"ID":"fdd9da54-4ee3-40c8-af3b-fb522971c160","Type":"ContainerDied","Data":"8b426305db57f447fca7b13b43be22f2b95a9f61dee44bf35039d349186b6df1"} Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.308990 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rk6kl" event={"ID":"9f317920-4f82-4691-8e1c-31b0b35526e8","Type":"ContainerStarted","Data":"ab4347a8dd9086883071cf633a1b3ed022ddc06d202e90171c2686bcf93e2b81"} Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.315614 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5d006848-2829-4dab-b441-dddfc1737bfa","Type":"ContainerStarted","Data":"e8367aee203cf3aa7021edb152a68ec2a6fa768b9913350489cbe4d1136be38c"} Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.322643 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.322629409 podStartE2EDuration="3.322629409s" 
podCreationTimestamp="2025-10-12 07:50:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:50:19.32187744 +0000 UTC m=+916.111072941" watchObservedRunningTime="2025-10-12 07:50:19.322629409 +0000 UTC m=+916.111824911" Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.340678 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-rfqhk"] Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.343530 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rfqhk" Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.385483 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rfqhk"] Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.451897 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-nsn2r"] Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.453657 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-nsn2r" Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.463167 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nsn2r"] Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.481747 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khtsd\" (UniqueName: \"kubernetes.io/projected/84022a3c-4f3a-40f7-b3f6-6aa3a83fb174-kube-api-access-khtsd\") pod \"nova-cell0-db-create-rfqhk\" (UID: \"84022a3c-4f3a-40f7-b3f6-6aa3a83fb174\") " pod="openstack/nova-cell0-db-create-rfqhk" Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.481882 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb8hp\" (UniqueName: \"kubernetes.io/projected/ac30765e-04ab-4e53-87ca-8d5bcf9df91b-kube-api-access-pb8hp\") pod \"nova-api-db-create-2p2tr\" (UID: \"ac30765e-04ab-4e53-87ca-8d5bcf9df91b\") " pod="openstack/nova-api-db-create-2p2tr" Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.559601 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df8f9a6f-b14e-4241-9102-4473869410d8" path="/var/lib/kubelet/pods/df8f9a6f-b14e-4241-9102-4473869410d8/volumes" Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.584746 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q78ds\" (UniqueName: \"kubernetes.io/projected/23162836-2301-4422-b6c0-30591a99bed0-kube-api-access-q78ds\") pod \"nova-cell1-db-create-nsn2r\" (UID: \"23162836-2301-4422-b6c0-30591a99bed0\") " pod="openstack/nova-cell1-db-create-nsn2r" Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.584853 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khtsd\" (UniqueName: 
\"kubernetes.io/projected/84022a3c-4f3a-40f7-b3f6-6aa3a83fb174-kube-api-access-khtsd\") pod \"nova-cell0-db-create-rfqhk\" (UID: \"84022a3c-4f3a-40f7-b3f6-6aa3a83fb174\") " pod="openstack/nova-cell0-db-create-rfqhk" Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.584902 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb8hp\" (UniqueName: \"kubernetes.io/projected/ac30765e-04ab-4e53-87ca-8d5bcf9df91b-kube-api-access-pb8hp\") pod \"nova-api-db-create-2p2tr\" (UID: \"ac30765e-04ab-4e53-87ca-8d5bcf9df91b\") " pod="openstack/nova-api-db-create-2p2tr" Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.603265 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khtsd\" (UniqueName: \"kubernetes.io/projected/84022a3c-4f3a-40f7-b3f6-6aa3a83fb174-kube-api-access-khtsd\") pod \"nova-cell0-db-create-rfqhk\" (UID: \"84022a3c-4f3a-40f7-b3f6-6aa3a83fb174\") " pod="openstack/nova-cell0-db-create-rfqhk" Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.603789 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb8hp\" (UniqueName: \"kubernetes.io/projected/ac30765e-04ab-4e53-87ca-8d5bcf9df91b-kube-api-access-pb8hp\") pod \"nova-api-db-create-2p2tr\" (UID: \"ac30765e-04ab-4e53-87ca-8d5bcf9df91b\") " pod="openstack/nova-api-db-create-2p2tr" Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.686362 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-rfqhk" Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.688156 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q78ds\" (UniqueName: \"kubernetes.io/projected/23162836-2301-4422-b6c0-30591a99bed0-kube-api-access-q78ds\") pod \"nova-cell1-db-create-nsn2r\" (UID: \"23162836-2301-4422-b6c0-30591a99bed0\") " pod="openstack/nova-cell1-db-create-nsn2r" Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.703297 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q78ds\" (UniqueName: \"kubernetes.io/projected/23162836-2301-4422-b6c0-30591a99bed0-kube-api-access-q78ds\") pod \"nova-cell1-db-create-nsn2r\" (UID: \"23162836-2301-4422-b6c0-30591a99bed0\") " pod="openstack/nova-cell1-db-create-nsn2r" Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.779102 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nsn2r" Oct 12 07:50:19 crc kubenswrapper[4599]: I1012 07:50:19.867184 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-2p2tr" Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.086468 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rfqhk"] Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.215922 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nsn2r"] Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.334934 4599 generic.go:334] "Generic (PLEG): container finished" podID="84022a3c-4f3a-40f7-b3f6-6aa3a83fb174" containerID="058b39d173e98ee0de66806ea3590e63914b0753557dc443f86bfee5ab33f4e1" exitCode=0 Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.335124 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rfqhk" event={"ID":"84022a3c-4f3a-40f7-b3f6-6aa3a83fb174","Type":"ContainerDied","Data":"058b39d173e98ee0de66806ea3590e63914b0753557dc443f86bfee5ab33f4e1"} Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.336045 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rfqhk" event={"ID":"84022a3c-4f3a-40f7-b3f6-6aa3a83fb174","Type":"ContainerStarted","Data":"83417eec70fa4716915d5fe84213c4f74374c33cc05c198fc987bcfabb01bec3"} Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.345678 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vvd6p" event={"ID":"fdd9da54-4ee3-40c8-af3b-fb522971c160","Type":"ContainerStarted","Data":"8a9161f6eb5724ce2b7b744603c36a31e9bbbf8a9f287ef35511a0953a5edf70"} Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.347806 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2p2tr"] Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.349082 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nsn2r" 
event={"ID":"23162836-2301-4422-b6c0-30591a99bed0","Type":"ContainerStarted","Data":"3c8eeb75912fe3fac4b9ed1c57f44973e8c9b1e7b760db3c770803b487df929d"} Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.356129 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5d006848-2829-4dab-b441-dddfc1737bfa","Type":"ContainerStarted","Data":"1123015fbef33924dcfc910ed316daf84f0d937ed6b2790a23410bff3bd0c0cf"} Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.356189 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.358106 4599 generic.go:334] "Generic (PLEG): container finished" podID="9f317920-4f82-4691-8e1c-31b0b35526e8" containerID="ab4347a8dd9086883071cf633a1b3ed022ddc06d202e90171c2686bcf93e2b81" exitCode=0 Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.358958 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rk6kl" event={"ID":"9f317920-4f82-4691-8e1c-31b0b35526e8","Type":"ContainerDied","Data":"ab4347a8dd9086883071cf633a1b3ed022ddc06d202e90171c2686bcf93e2b81"} Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.375721 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vvd6p" podStartSLOduration=2.7104057470000003 podStartE2EDuration="5.375706829s" podCreationTimestamp="2025-10-12 07:50:15 +0000 UTC" firstStartedPulling="2025-10-12 07:50:17.253535453 +0000 UTC m=+914.042730955" lastFinishedPulling="2025-10-12 07:50:19.918836535 +0000 UTC m=+916.708032037" observedRunningTime="2025-10-12 07:50:20.363285606 +0000 UTC m=+917.152481108" watchObservedRunningTime="2025-10-12 07:50:20.375706829 +0000 UTC m=+917.164902331" Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.385535 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" 
podStartSLOduration=2.38551551 podStartE2EDuration="2.38551551s" podCreationTimestamp="2025-10-12 07:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:50:20.38174293 +0000 UTC m=+917.170938432" watchObservedRunningTime="2025-10-12 07:50:20.38551551 +0000 UTC m=+917.174711012" Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.400589 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-nsn2r" podStartSLOduration=1.400568889 podStartE2EDuration="1.400568889s" podCreationTimestamp="2025-10-12 07:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:50:20.395633174 +0000 UTC m=+917.184828677" watchObservedRunningTime="2025-10-12 07:50:20.400568889 +0000 UTC m=+917.189764391" Oct 12 07:50:20 crc kubenswrapper[4599]: W1012 07:50:20.452403 4599 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f317920_4f82_4691_8e1c_31b0b35526e8.slice/crio-conmon-ab4347a8dd9086883071cf633a1b3ed022ddc06d202e90171c2686bcf93e2b81.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f317920_4f82_4691_8e1c_31b0b35526e8.slice/crio-conmon-ab4347a8dd9086883071cf633a1b3ed022ddc06d202e90171c2686bcf93e2b81.scope: no such file or directory Oct 12 07:50:20 crc kubenswrapper[4599]: W1012 07:50:20.452459 4599 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f317920_4f82_4691_8e1c_31b0b35526e8.slice/crio-ab4347a8dd9086883071cf633a1b3ed022ddc06d202e90171c2686bcf93e2b81.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f317920_4f82_4691_8e1c_31b0b35526e8.slice/crio-ab4347a8dd9086883071cf633a1b3ed022ddc06d202e90171c2686bcf93e2b81.scope: no such file or directory Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.773252 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.818094 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/231584a7-c306-4904-9e9b-33789d2d42bd-run-httpd\") pod \"231584a7-c306-4904-9e9b-33789d2d42bd\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.818220 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-sg-core-conf-yaml\") pod \"231584a7-c306-4904-9e9b-33789d2d42bd\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.818246 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-combined-ca-bundle\") pod \"231584a7-c306-4904-9e9b-33789d2d42bd\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.818389 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-scripts\") pod \"231584a7-c306-4904-9e9b-33789d2d42bd\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.818438 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/231584a7-c306-4904-9e9b-33789d2d42bd-log-httpd\") pod \"231584a7-c306-4904-9e9b-33789d2d42bd\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.818474 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-config-data\") pod \"231584a7-c306-4904-9e9b-33789d2d42bd\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.818646 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/231584a7-c306-4904-9e9b-33789d2d42bd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "231584a7-c306-4904-9e9b-33789d2d42bd" (UID: "231584a7-c306-4904-9e9b-33789d2d42bd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.818958 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/231584a7-c306-4904-9e9b-33789d2d42bd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "231584a7-c306-4904-9e9b-33789d2d42bd" (UID: "231584a7-c306-4904-9e9b-33789d2d42bd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.818986 4599 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/231584a7-c306-4904-9e9b-33789d2d42bd-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.827875 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-scripts" (OuterVolumeSpecName: "scripts") pod "231584a7-c306-4904-9e9b-33789d2d42bd" (UID: "231584a7-c306-4904-9e9b-33789d2d42bd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.847296 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "231584a7-c306-4904-9e9b-33789d2d42bd" (UID: "231584a7-c306-4904-9e9b-33789d2d42bd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.914257 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "231584a7-c306-4904-9e9b-33789d2d42bd" (UID: "231584a7-c306-4904-9e9b-33789d2d42bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.916074 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-config-data" (OuterVolumeSpecName: "config-data") pod "231584a7-c306-4904-9e9b-33789d2d42bd" (UID: "231584a7-c306-4904-9e9b-33789d2d42bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.919939 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snfzw\" (UniqueName: \"kubernetes.io/projected/231584a7-c306-4904-9e9b-33789d2d42bd-kube-api-access-snfzw\") pod \"231584a7-c306-4904-9e9b-33789d2d42bd\" (UID: \"231584a7-c306-4904-9e9b-33789d2d42bd\") " Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.921006 4599 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.921098 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.921184 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.921261 4599 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/231584a7-c306-4904-9e9b-33789d2d42bd-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.921393 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231584a7-c306-4904-9e9b-33789d2d42bd-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:20 crc kubenswrapper[4599]: I1012 07:50:20.922280 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/231584a7-c306-4904-9e9b-33789d2d42bd-kube-api-access-snfzw" (OuterVolumeSpecName: 
"kube-api-access-snfzw") pod "231584a7-c306-4904-9e9b-33789d2d42bd" (UID: "231584a7-c306-4904-9e9b-33789d2d42bd"). InnerVolumeSpecName "kube-api-access-snfzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.023003 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snfzw\" (UniqueName: \"kubernetes.io/projected/231584a7-c306-4904-9e9b-33789d2d42bd-kube-api-access-snfzw\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.348102 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6dcf447b8f-d5qql" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.369356 4599 generic.go:334] "Generic (PLEG): container finished" podID="23162836-2301-4422-b6c0-30591a99bed0" containerID="f7e0ba824b59471ddc71385a78893d38f268f35e5ed401fe423e7e79ae0fb537" exitCode=0 Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.369424 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nsn2r" event={"ID":"23162836-2301-4422-b6c0-30591a99bed0","Type":"ContainerDied","Data":"f7e0ba824b59471ddc71385a78893d38f268f35e5ed401fe423e7e79ae0fb537"} Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.371537 4599 generic.go:334] "Generic (PLEG): container finished" podID="ac30765e-04ab-4e53-87ca-8d5bcf9df91b" containerID="a84af588ccb5ecbc5a78a080a52ad804c0d650adac4e87c10abdbcf34d96c2e5" exitCode=0 Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.371586 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2p2tr" event={"ID":"ac30765e-04ab-4e53-87ca-8d5bcf9df91b","Type":"ContainerDied","Data":"a84af588ccb5ecbc5a78a080a52ad804c0d650adac4e87c10abdbcf34d96c2e5"} Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.371603 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2p2tr" 
event={"ID":"ac30765e-04ab-4e53-87ca-8d5bcf9df91b","Type":"ContainerStarted","Data":"355f0903fa2a573d2e0ad6e72f5d63d6cef5868a08a0c226d0f43ec8407291cc"} Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.377995 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rk6kl" event={"ID":"9f317920-4f82-4691-8e1c-31b0b35526e8","Type":"ContainerStarted","Data":"b638aaa1589d905594aec15c290c10b79fe2cdcf894d90b0dfacac9aa4f378b7"} Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.409200 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-586f7cb8d6-xq2lk"] Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.409668 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-586f7cb8d6-xq2lk" podUID="fc5e8701-df4e-459c-85ae-78db217fcea0" containerName="neutron-api" containerID="cri-o://94fce9a0f63f7cf8b138f590f6da338cb28eb8bb2a21c68e841b5c625a81f6c4" gracePeriod=30 Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.409819 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-586f7cb8d6-xq2lk" podUID="fc5e8701-df4e-459c-85ae-78db217fcea0" containerName="neutron-httpd" containerID="cri-o://869243432b03a8368e341b8063082bb68d7d1901d66174e6ffed62187df2b87e" gracePeriod=30 Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.411524 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5d006848-2829-4dab-b441-dddfc1737bfa","Type":"ContainerStarted","Data":"c262d4793ee22a5f9bfb8340b0f831114523f3e959c3e2da3ed1fda235c122f4"} Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.421114 4599 generic.go:334] "Generic (PLEG): container finished" podID="231584a7-c306-4904-9e9b-33789d2d42bd" containerID="7f6072cfb4f6896768d0e01eca5e8c15338d230a547fd6a1d3ca96a4b37bfe68" exitCode=0 Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.421490 4599 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.422825 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"231584a7-c306-4904-9e9b-33789d2d42bd","Type":"ContainerDied","Data":"7f6072cfb4f6896768d0e01eca5e8c15338d230a547fd6a1d3ca96a4b37bfe68"} Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.422864 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"231584a7-c306-4904-9e9b-33789d2d42bd","Type":"ContainerDied","Data":"059691a96770f3d3e027021cd291e5406ca3ab15d9e044c27da87547f4d01d10"} Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.422882 4599 scope.go:117] "RemoveContainer" containerID="d6a485cb3d09277c99a1466dfa01b9a467b7629f0348cf90738e98d023d5b588" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.478801 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rk6kl" podStartSLOduration=2.896779467 podStartE2EDuration="5.478765965s" podCreationTimestamp="2025-10-12 07:50:16 +0000 UTC" firstStartedPulling="2025-10-12 07:50:18.296125219 +0000 UTC m=+915.085320721" lastFinishedPulling="2025-10-12 07:50:20.878111717 +0000 UTC m=+917.667307219" observedRunningTime="2025-10-12 07:50:21.461078184 +0000 UTC m=+918.250273696" watchObservedRunningTime="2025-10-12 07:50:21.478765965 +0000 UTC m=+918.267961467" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.531679 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.533526 4599 scope.go:117] "RemoveContainer" containerID="31fa37b1108d09e65952d61876d188c75a92a7ec64185f51761a8b0c066ca976" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.556994 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 
07:50:21.567413 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:50:21 crc kubenswrapper[4599]: E1012 07:50:21.567840 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231584a7-c306-4904-9e9b-33789d2d42bd" containerName="ceilometer-notification-agent" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.567863 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="231584a7-c306-4904-9e9b-33789d2d42bd" containerName="ceilometer-notification-agent" Oct 12 07:50:21 crc kubenswrapper[4599]: E1012 07:50:21.567885 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231584a7-c306-4904-9e9b-33789d2d42bd" containerName="ceilometer-central-agent" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.567892 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="231584a7-c306-4904-9e9b-33789d2d42bd" containerName="ceilometer-central-agent" Oct 12 07:50:21 crc kubenswrapper[4599]: E1012 07:50:21.567905 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231584a7-c306-4904-9e9b-33789d2d42bd" containerName="proxy-httpd" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.567911 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="231584a7-c306-4904-9e9b-33789d2d42bd" containerName="proxy-httpd" Oct 12 07:50:21 crc kubenswrapper[4599]: E1012 07:50:21.567935 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231584a7-c306-4904-9e9b-33789d2d42bd" containerName="sg-core" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.567941 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="231584a7-c306-4904-9e9b-33789d2d42bd" containerName="sg-core" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.568094 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="231584a7-c306-4904-9e9b-33789d2d42bd" containerName="proxy-httpd" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.568110 4599 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="231584a7-c306-4904-9e9b-33789d2d42bd" containerName="sg-core" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.568125 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="231584a7-c306-4904-9e9b-33789d2d42bd" containerName="ceilometer-central-agent" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.568145 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="231584a7-c306-4904-9e9b-33789d2d42bd" containerName="ceilometer-notification-agent" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.569878 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.572809 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.572956 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.575551 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.605912 4599 scope.go:117] "RemoveContainer" containerID="9f0345b593bc8549548e28ca4afefae19c14e8a42fbfbdf1e752f086d5846fe4" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.636372 4599 scope.go:117] "RemoveContainer" containerID="7f6072cfb4f6896768d0e01eca5e8c15338d230a547fd6a1d3ca96a4b37bfe68" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.661093 4599 scope.go:117] "RemoveContainer" containerID="d6a485cb3d09277c99a1466dfa01b9a467b7629f0348cf90738e98d023d5b588" Oct 12 07:50:21 crc kubenswrapper[4599]: E1012 07:50:21.661572 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6a485cb3d09277c99a1466dfa01b9a467b7629f0348cf90738e98d023d5b588\": container with ID 
starting with d6a485cb3d09277c99a1466dfa01b9a467b7629f0348cf90738e98d023d5b588 not found: ID does not exist" containerID="d6a485cb3d09277c99a1466dfa01b9a467b7629f0348cf90738e98d023d5b588" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.661605 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6a485cb3d09277c99a1466dfa01b9a467b7629f0348cf90738e98d023d5b588"} err="failed to get container status \"d6a485cb3d09277c99a1466dfa01b9a467b7629f0348cf90738e98d023d5b588\": rpc error: code = NotFound desc = could not find container \"d6a485cb3d09277c99a1466dfa01b9a467b7629f0348cf90738e98d023d5b588\": container with ID starting with d6a485cb3d09277c99a1466dfa01b9a467b7629f0348cf90738e98d023d5b588 not found: ID does not exist" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.661629 4599 scope.go:117] "RemoveContainer" containerID="31fa37b1108d09e65952d61876d188c75a92a7ec64185f51761a8b0c066ca976" Oct 12 07:50:21 crc kubenswrapper[4599]: E1012 07:50:21.661868 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31fa37b1108d09e65952d61876d188c75a92a7ec64185f51761a8b0c066ca976\": container with ID starting with 31fa37b1108d09e65952d61876d188c75a92a7ec64185f51761a8b0c066ca976 not found: ID does not exist" containerID="31fa37b1108d09e65952d61876d188c75a92a7ec64185f51761a8b0c066ca976" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.661935 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31fa37b1108d09e65952d61876d188c75a92a7ec64185f51761a8b0c066ca976"} err="failed to get container status \"31fa37b1108d09e65952d61876d188c75a92a7ec64185f51761a8b0c066ca976\": rpc error: code = NotFound desc = could not find container \"31fa37b1108d09e65952d61876d188c75a92a7ec64185f51761a8b0c066ca976\": container with ID starting with 31fa37b1108d09e65952d61876d188c75a92a7ec64185f51761a8b0c066ca976 not found: 
ID does not exist" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.661949 4599 scope.go:117] "RemoveContainer" containerID="9f0345b593bc8549548e28ca4afefae19c14e8a42fbfbdf1e752f086d5846fe4" Oct 12 07:50:21 crc kubenswrapper[4599]: E1012 07:50:21.662179 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f0345b593bc8549548e28ca4afefae19c14e8a42fbfbdf1e752f086d5846fe4\": container with ID starting with 9f0345b593bc8549548e28ca4afefae19c14e8a42fbfbdf1e752f086d5846fe4 not found: ID does not exist" containerID="9f0345b593bc8549548e28ca4afefae19c14e8a42fbfbdf1e752f086d5846fe4" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.662198 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f0345b593bc8549548e28ca4afefae19c14e8a42fbfbdf1e752f086d5846fe4"} err="failed to get container status \"9f0345b593bc8549548e28ca4afefae19c14e8a42fbfbdf1e752f086d5846fe4\": rpc error: code = NotFound desc = could not find container \"9f0345b593bc8549548e28ca4afefae19c14e8a42fbfbdf1e752f086d5846fe4\": container with ID starting with 9f0345b593bc8549548e28ca4afefae19c14e8a42fbfbdf1e752f086d5846fe4 not found: ID does not exist" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.662213 4599 scope.go:117] "RemoveContainer" containerID="7f6072cfb4f6896768d0e01eca5e8c15338d230a547fd6a1d3ca96a4b37bfe68" Oct 12 07:50:21 crc kubenswrapper[4599]: E1012 07:50:21.663206 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f6072cfb4f6896768d0e01eca5e8c15338d230a547fd6a1d3ca96a4b37bfe68\": container with ID starting with 7f6072cfb4f6896768d0e01eca5e8c15338d230a547fd6a1d3ca96a4b37bfe68 not found: ID does not exist" containerID="7f6072cfb4f6896768d0e01eca5e8c15338d230a547fd6a1d3ca96a4b37bfe68" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.663234 4599 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f6072cfb4f6896768d0e01eca5e8c15338d230a547fd6a1d3ca96a4b37bfe68"} err="failed to get container status \"7f6072cfb4f6896768d0e01eca5e8c15338d230a547fd6a1d3ca96a4b37bfe68\": rpc error: code = NotFound desc = could not find container \"7f6072cfb4f6896768d0e01eca5e8c15338d230a547fd6a1d3ca96a4b37bfe68\": container with ID starting with 7f6072cfb4f6896768d0e01eca5e8c15338d230a547fd6a1d3ca96a4b37bfe68 not found: ID does not exist" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.749902 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqqsh\" (UniqueName: \"kubernetes.io/projected/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-kube-api-access-sqqsh\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.749965 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-run-httpd\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.750193 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-log-httpd\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.750245 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-config-data\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 
07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.750318 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.750390 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-scripts\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.750516 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.838059 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-rfqhk" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.852278 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-scripts\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.852379 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.852505 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqqsh\" (UniqueName: \"kubernetes.io/projected/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-kube-api-access-sqqsh\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.852549 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-run-httpd\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.852618 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-config-data\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.852631 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-log-httpd\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.852659 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.855514 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-run-httpd\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.856488 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-log-httpd\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.858730 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-scripts\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.859633 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.860566 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.860773 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-config-data\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.870373 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqqsh\" (UniqueName: \"kubernetes.io/projected/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-kube-api-access-sqqsh\") pod \"ceilometer-0\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.916278 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.953420 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khtsd\" (UniqueName: \"kubernetes.io/projected/84022a3c-4f3a-40f7-b3f6-6aa3a83fb174-kube-api-access-khtsd\") pod \"84022a3c-4f3a-40f7-b3f6-6aa3a83fb174\" (UID: \"84022a3c-4f3a-40f7-b3f6-6aa3a83fb174\") " Oct 12 07:50:21 crc kubenswrapper[4599]: I1012 07:50:21.956378 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84022a3c-4f3a-40f7-b3f6-6aa3a83fb174-kube-api-access-khtsd" (OuterVolumeSpecName: "kube-api-access-khtsd") pod "84022a3c-4f3a-40f7-b3f6-6aa3a83fb174" (UID: "84022a3c-4f3a-40f7-b3f6-6aa3a83fb174"). InnerVolumeSpecName "kube-api-access-khtsd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:50:22 crc kubenswrapper[4599]: I1012 07:50:22.056415 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khtsd\" (UniqueName: \"kubernetes.io/projected/84022a3c-4f3a-40f7-b3f6-6aa3a83fb174-kube-api-access-khtsd\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:22 crc kubenswrapper[4599]: I1012 07:50:22.323075 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:50:22 crc kubenswrapper[4599]: W1012 07:50:22.328584 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad8ac3e7_8df6_433e_8fb6_2ed728f133ad.slice/crio-edb4d5bd111e802bd3eaeefffa1b14c126cd8f1b1484eea9f9280ef5e2c359e6 WatchSource:0}: Error finding container edb4d5bd111e802bd3eaeefffa1b14c126cd8f1b1484eea9f9280ef5e2c359e6: Status 404 returned error can't find the container with id edb4d5bd111e802bd3eaeefffa1b14c126cd8f1b1484eea9f9280ef5e2c359e6 Oct 12 07:50:22 crc kubenswrapper[4599]: I1012 07:50:22.430534 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-rfqhk" Oct 12 07:50:22 crc kubenswrapper[4599]: I1012 07:50:22.430634 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rfqhk" event={"ID":"84022a3c-4f3a-40f7-b3f6-6aa3a83fb174","Type":"ContainerDied","Data":"83417eec70fa4716915d5fe84213c4f74374c33cc05c198fc987bcfabb01bec3"} Oct 12 07:50:22 crc kubenswrapper[4599]: I1012 07:50:22.430672 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83417eec70fa4716915d5fe84213c4f74374c33cc05c198fc987bcfabb01bec3" Oct 12 07:50:22 crc kubenswrapper[4599]: I1012 07:50:22.432558 4599 generic.go:334] "Generic (PLEG): container finished" podID="fc5e8701-df4e-459c-85ae-78db217fcea0" containerID="869243432b03a8368e341b8063082bb68d7d1901d66174e6ffed62187df2b87e" exitCode=0 Oct 12 07:50:22 crc kubenswrapper[4599]: I1012 07:50:22.432645 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-586f7cb8d6-xq2lk" event={"ID":"fc5e8701-df4e-459c-85ae-78db217fcea0","Type":"ContainerDied","Data":"869243432b03a8368e341b8063082bb68d7d1901d66174e6ffed62187df2b87e"} Oct 12 07:50:22 crc kubenswrapper[4599]: I1012 07:50:22.433747 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad","Type":"ContainerStarted","Data":"edb4d5bd111e802bd3eaeefffa1b14c126cd8f1b1484eea9f9280ef5e2c359e6"} Oct 12 07:50:22 crc kubenswrapper[4599]: I1012 07:50:22.560770 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 12 07:50:22 crc kubenswrapper[4599]: I1012 07:50:22.560824 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 12 07:50:22 crc kubenswrapper[4599]: I1012 07:50:22.606764 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Oct 12 07:50:22 crc kubenswrapper[4599]: I1012 07:50:22.609005 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 12 07:50:22 crc kubenswrapper[4599]: I1012 07:50:22.766870 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nsn2r" Oct 12 07:50:22 crc kubenswrapper[4599]: I1012 07:50:22.774628 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2p2tr" Oct 12 07:50:22 crc kubenswrapper[4599]: I1012 07:50:22.873594 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q78ds\" (UniqueName: \"kubernetes.io/projected/23162836-2301-4422-b6c0-30591a99bed0-kube-api-access-q78ds\") pod \"23162836-2301-4422-b6c0-30591a99bed0\" (UID: \"23162836-2301-4422-b6c0-30591a99bed0\") " Oct 12 07:50:22 crc kubenswrapper[4599]: I1012 07:50:22.875571 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb8hp\" (UniqueName: \"kubernetes.io/projected/ac30765e-04ab-4e53-87ca-8d5bcf9df91b-kube-api-access-pb8hp\") pod \"ac30765e-04ab-4e53-87ca-8d5bcf9df91b\" (UID: \"ac30765e-04ab-4e53-87ca-8d5bcf9df91b\") " Oct 12 07:50:22 crc kubenswrapper[4599]: I1012 07:50:22.880860 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac30765e-04ab-4e53-87ca-8d5bcf9df91b-kube-api-access-pb8hp" (OuterVolumeSpecName: "kube-api-access-pb8hp") pod "ac30765e-04ab-4e53-87ca-8d5bcf9df91b" (UID: "ac30765e-04ab-4e53-87ca-8d5bcf9df91b"). InnerVolumeSpecName "kube-api-access-pb8hp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:50:22 crc kubenswrapper[4599]: I1012 07:50:22.882009 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23162836-2301-4422-b6c0-30591a99bed0-kube-api-access-q78ds" (OuterVolumeSpecName: "kube-api-access-q78ds") pod "23162836-2301-4422-b6c0-30591a99bed0" (UID: "23162836-2301-4422-b6c0-30591a99bed0"). InnerVolumeSpecName "kube-api-access-q78ds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:50:22 crc kubenswrapper[4599]: I1012 07:50:22.981066 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb8hp\" (UniqueName: \"kubernetes.io/projected/ac30765e-04ab-4e53-87ca-8d5bcf9df91b-kube-api-access-pb8hp\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:22 crc kubenswrapper[4599]: I1012 07:50:22.981106 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q78ds\" (UniqueName: \"kubernetes.io/projected/23162836-2301-4422-b6c0-30591a99bed0-kube-api-access-q78ds\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:23 crc kubenswrapper[4599]: I1012 07:50:23.464453 4599 generic.go:334] "Generic (PLEG): container finished" podID="fc5e8701-df4e-459c-85ae-78db217fcea0" containerID="94fce9a0f63f7cf8b138f590f6da338cb28eb8bb2a21c68e841b5c625a81f6c4" exitCode=0 Oct 12 07:50:23 crc kubenswrapper[4599]: I1012 07:50:23.464545 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-586f7cb8d6-xq2lk" event={"ID":"fc5e8701-df4e-459c-85ae-78db217fcea0","Type":"ContainerDied","Data":"94fce9a0f63f7cf8b138f590f6da338cb28eb8bb2a21c68e841b5c625a81f6c4"} Oct 12 07:50:23 crc kubenswrapper[4599]: I1012 07:50:23.469498 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad","Type":"ContainerStarted","Data":"a322b17e1734018bc819124a518971344bf7d4c215c1f9395b066f21d8246130"} Oct 12 07:50:23 crc kubenswrapper[4599]: I1012 
07:50:23.471249 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nsn2r" event={"ID":"23162836-2301-4422-b6c0-30591a99bed0","Type":"ContainerDied","Data":"3c8eeb75912fe3fac4b9ed1c57f44973e8c9b1e7b760db3c770803b487df929d"} Oct 12 07:50:23 crc kubenswrapper[4599]: I1012 07:50:23.471286 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c8eeb75912fe3fac4b9ed1c57f44973e8c9b1e7b760db3c770803b487df929d" Oct 12 07:50:23 crc kubenswrapper[4599]: I1012 07:50:23.471359 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nsn2r" Oct 12 07:50:23 crc kubenswrapper[4599]: I1012 07:50:23.479146 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2p2tr" Oct 12 07:50:23 crc kubenswrapper[4599]: I1012 07:50:23.479139 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2p2tr" event={"ID":"ac30765e-04ab-4e53-87ca-8d5bcf9df91b","Type":"ContainerDied","Data":"355f0903fa2a573d2e0ad6e72f5d63d6cef5868a08a0c226d0f43ec8407291cc"} Oct 12 07:50:23 crc kubenswrapper[4599]: I1012 07:50:23.479210 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="355f0903fa2a573d2e0ad6e72f5d63d6cef5868a08a0c226d0f43ec8407291cc" Oct 12 07:50:23 crc kubenswrapper[4599]: I1012 07:50:23.479694 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 12 07:50:23 crc kubenswrapper[4599]: I1012 07:50:23.479736 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 12 07:50:23 crc kubenswrapper[4599]: I1012 07:50:23.555759 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="231584a7-c306-4904-9e9b-33789d2d42bd" path="/var/lib/kubelet/pods/231584a7-c306-4904-9e9b-33789d2d42bd/volumes" Oct 12 
07:50:23 crc kubenswrapper[4599]: I1012 07:50:23.723384 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-586f7cb8d6-xq2lk" Oct 12 07:50:23 crc kubenswrapper[4599]: I1012 07:50:23.905775 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-httpd-config\") pod \"fc5e8701-df4e-459c-85ae-78db217fcea0\" (UID: \"fc5e8701-df4e-459c-85ae-78db217fcea0\") " Oct 12 07:50:23 crc kubenswrapper[4599]: I1012 07:50:23.905921 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-488pl\" (UniqueName: \"kubernetes.io/projected/fc5e8701-df4e-459c-85ae-78db217fcea0-kube-api-access-488pl\") pod \"fc5e8701-df4e-459c-85ae-78db217fcea0\" (UID: \"fc5e8701-df4e-459c-85ae-78db217fcea0\") " Oct 12 07:50:23 crc kubenswrapper[4599]: I1012 07:50:23.906054 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-combined-ca-bundle\") pod \"fc5e8701-df4e-459c-85ae-78db217fcea0\" (UID: \"fc5e8701-df4e-459c-85ae-78db217fcea0\") " Oct 12 07:50:23 crc kubenswrapper[4599]: I1012 07:50:23.906190 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-config\") pod \"fc5e8701-df4e-459c-85ae-78db217fcea0\" (UID: \"fc5e8701-df4e-459c-85ae-78db217fcea0\") " Oct 12 07:50:23 crc kubenswrapper[4599]: I1012 07:50:23.906221 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-ovndb-tls-certs\") pod \"fc5e8701-df4e-459c-85ae-78db217fcea0\" (UID: \"fc5e8701-df4e-459c-85ae-78db217fcea0\") " Oct 12 07:50:23 crc kubenswrapper[4599]: I1012 
07:50:23.915037 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc5e8701-df4e-459c-85ae-78db217fcea0-kube-api-access-488pl" (OuterVolumeSpecName: "kube-api-access-488pl") pod "fc5e8701-df4e-459c-85ae-78db217fcea0" (UID: "fc5e8701-df4e-459c-85ae-78db217fcea0"). InnerVolumeSpecName "kube-api-access-488pl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 07:50:23 crc kubenswrapper[4599]: I1012 07:50:23.949321 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "fc5e8701-df4e-459c-85ae-78db217fcea0" (UID: "fc5e8701-df4e-459c-85ae-78db217fcea0"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 07:50:23 crc kubenswrapper[4599]: I1012 07:50:23.987503 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-config" (OuterVolumeSpecName: "config") pod "fc5e8701-df4e-459c-85ae-78db217fcea0" (UID: "fc5e8701-df4e-459c-85ae-78db217fcea0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 07:50:24 crc kubenswrapper[4599]: I1012 07:50:24.002447 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc5e8701-df4e-459c-85ae-78db217fcea0" (UID: "fc5e8701-df4e-459c-85ae-78db217fcea0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 07:50:24 crc kubenswrapper[4599]: I1012 07:50:24.009429 4599 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-httpd-config\") on node \"crc\" DevicePath \"\""
Oct 12 07:50:24 crc kubenswrapper[4599]: I1012 07:50:24.009455 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-488pl\" (UniqueName: \"kubernetes.io/projected/fc5e8701-df4e-459c-85ae-78db217fcea0-kube-api-access-488pl\") on node \"crc\" DevicePath \"\""
Oct 12 07:50:24 crc kubenswrapper[4599]: I1012 07:50:24.009467 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 12 07:50:24 crc kubenswrapper[4599]: I1012 07:50:24.009475 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-config\") on node \"crc\" DevicePath \"\""
Oct 12 07:50:24 crc kubenswrapper[4599]: I1012 07:50:24.061428 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "fc5e8701-df4e-459c-85ae-78db217fcea0" (UID: "fc5e8701-df4e-459c-85ae-78db217fcea0"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 07:50:24 crc kubenswrapper[4599]: I1012 07:50:24.114907 4599 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc5e8701-df4e-459c-85ae-78db217fcea0-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 12 07:50:24 crc kubenswrapper[4599]: I1012 07:50:24.487875 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-586f7cb8d6-xq2lk" event={"ID":"fc5e8701-df4e-459c-85ae-78db217fcea0","Type":"ContainerDied","Data":"b42401f8cf3b4e6c273c057fa4ba267bdd331d8d4876919e80c12c05574c2cd5"}
Oct 12 07:50:24 crc kubenswrapper[4599]: I1012 07:50:24.487934 4599 scope.go:117] "RemoveContainer" containerID="869243432b03a8368e341b8063082bb68d7d1901d66174e6ffed62187df2b87e"
Oct 12 07:50:24 crc kubenswrapper[4599]: I1012 07:50:24.488070 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-586f7cb8d6-xq2lk"
Oct 12 07:50:24 crc kubenswrapper[4599]: I1012 07:50:24.492817 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad","Type":"ContainerStarted","Data":"b6e98c23c623db132e7de729ee558b4d755f97b5bf06ed983fb5cf5d69aa1dbd"}
Oct 12 07:50:24 crc kubenswrapper[4599]: I1012 07:50:24.510500 4599 scope.go:117] "RemoveContainer" containerID="94fce9a0f63f7cf8b138f590f6da338cb28eb8bb2a21c68e841b5c625a81f6c4"
Oct 12 07:50:24 crc kubenswrapper[4599]: I1012 07:50:24.528327 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-586f7cb8d6-xq2lk"]
Oct 12 07:50:24 crc kubenswrapper[4599]: I1012 07:50:24.537231 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-586f7cb8d6-xq2lk"]
Oct 12 07:50:25 crc kubenswrapper[4599]: I1012 07:50:25.225570 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 12 07:50:25 crc kubenswrapper[4599]: I1012 07:50:25.255663 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 12 07:50:25 crc kubenswrapper[4599]: I1012 07:50:25.504004 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad","Type":"ContainerStarted","Data":"c703f77069f7bc020c223a930b6f6fa64c4d462d188dd0b6b07b7916886cf861"}
Oct 12 07:50:25 crc kubenswrapper[4599]: I1012 07:50:25.553564 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc5e8701-df4e-459c-85ae-78db217fcea0" path="/var/lib/kubelet/pods/fc5e8701-df4e-459c-85ae-78db217fcea0/volumes"
Oct 12 07:50:25 crc kubenswrapper[4599]: I1012 07:50:25.801597 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vvd6p"
Oct 12 07:50:25 crc kubenswrapper[4599]: I1012 07:50:25.801776 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vvd6p"
Oct 12 07:50:25 crc kubenswrapper[4599]: I1012 07:50:25.842789 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vvd6p"
Oct 12 07:50:26 crc kubenswrapper[4599]: I1012 07:50:26.360526 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 12 07:50:26 crc kubenswrapper[4599]: I1012 07:50:26.512585 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad","Type":"ContainerStarted","Data":"e809f78f30205fffacf4be5e9441fd40efaafd14eca9b39d55b55e314a7578f5"}
Oct 12 07:50:26 crc kubenswrapper[4599]: I1012 07:50:26.513054 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" containerName="ceilometer-central-agent" containerID="cri-o://a322b17e1734018bc819124a518971344bf7d4c215c1f9395b066f21d8246130" gracePeriod=30
Oct 12 07:50:26 crc kubenswrapper[4599]: I1012 07:50:26.513245 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" containerName="proxy-httpd" containerID="cri-o://e809f78f30205fffacf4be5e9441fd40efaafd14eca9b39d55b55e314a7578f5" gracePeriod=30
Oct 12 07:50:26 crc kubenswrapper[4599]: I1012 07:50:26.513373 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" containerName="sg-core" containerID="cri-o://c703f77069f7bc020c223a930b6f6fa64c4d462d188dd0b6b07b7916886cf861" gracePeriod=30
Oct 12 07:50:26 crc kubenswrapper[4599]: I1012 07:50:26.513380 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" containerName="ceilometer-notification-agent" containerID="cri-o://b6e98c23c623db132e7de729ee558b4d755f97b5bf06ed983fb5cf5d69aa1dbd" gracePeriod=30
Oct 12 07:50:26 crc kubenswrapper[4599]: I1012 07:50:26.540615 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.701865967 podStartE2EDuration="5.540599447s" podCreationTimestamp="2025-10-12 07:50:21 +0000 UTC" firstStartedPulling="2025-10-12 07:50:22.33300541 +0000 UTC m=+919.122200912" lastFinishedPulling="2025-10-12 07:50:26.17173889 +0000 UTC m=+922.960934392" observedRunningTime="2025-10-12 07:50:26.535712556 +0000 UTC m=+923.324908057" watchObservedRunningTime="2025-10-12 07:50:26.540599447 +0000 UTC m=+923.329794950"
Oct 12 07:50:26 crc kubenswrapper[4599]: I1012 07:50:26.551703 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vvd6p"
Oct 12 07:50:26 crc kubenswrapper[4599]: I1012 07:50:26.595795 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Oct 12 07:50:26 crc kubenswrapper[4599]: I1012 07:50:26.595838 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Oct 12 07:50:26 crc kubenswrapper[4599]: I1012 07:50:26.596195 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vvd6p"]
Oct 12 07:50:26 crc kubenswrapper[4599]: I1012 07:50:26.639488 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Oct 12 07:50:26 crc kubenswrapper[4599]: I1012 07:50:26.647061 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Oct 12 07:50:26 crc kubenswrapper[4599]: I1012 07:50:26.881539 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rk6kl"
Oct 12 07:50:26 crc kubenswrapper[4599]: I1012 07:50:26.881627 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rk6kl"
Oct 12 07:50:26 crc kubenswrapper[4599]: I1012 07:50:26.922045 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rk6kl"
Oct 12 07:50:27 crc kubenswrapper[4599]: I1012 07:50:27.525750 4599 generic.go:334] "Generic (PLEG): container finished" podID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" containerID="e809f78f30205fffacf4be5e9441fd40efaafd14eca9b39d55b55e314a7578f5" exitCode=0
Oct 12 07:50:27 crc kubenswrapper[4599]: I1012 07:50:27.525788 4599 generic.go:334] "Generic (PLEG): container finished" podID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" containerID="c703f77069f7bc020c223a930b6f6fa64c4d462d188dd0b6b07b7916886cf861" exitCode=2
Oct 12 07:50:27 crc kubenswrapper[4599]: I1012 07:50:27.525796 4599 generic.go:334] "Generic (PLEG): container finished" podID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" containerID="b6e98c23c623db132e7de729ee558b4d755f97b5bf06ed983fb5cf5d69aa1dbd" exitCode=0
Oct 12 07:50:27 crc kubenswrapper[4599]: I1012 07:50:27.525847 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad","Type":"ContainerDied","Data":"e809f78f30205fffacf4be5e9441fd40efaafd14eca9b39d55b55e314a7578f5"}
Oct 12 07:50:27 crc kubenswrapper[4599]: I1012 07:50:27.525915 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad","Type":"ContainerDied","Data":"c703f77069f7bc020c223a930b6f6fa64c4d462d188dd0b6b07b7916886cf861"}
Oct 12 07:50:27 crc kubenswrapper[4599]: I1012 07:50:27.525927 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad","Type":"ContainerDied","Data":"b6e98c23c623db132e7de729ee558b4d755f97b5bf06ed983fb5cf5d69aa1dbd"}
Oct 12 07:50:27 crc kubenswrapper[4599]: I1012 07:50:27.526722 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 12 07:50:27 crc kubenswrapper[4599]: I1012 07:50:27.526763 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 12 07:50:27 crc kubenswrapper[4599]: I1012 07:50:27.565770 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rk6kl"
Oct 12 07:50:28 crc kubenswrapper[4599]: I1012 07:50:28.322196 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 12 07:50:28 crc kubenswrapper[4599]: I1012 07:50:28.322661 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 12 07:50:28 crc kubenswrapper[4599]: I1012 07:50:28.322725 4599 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c"
Oct 12 07:50:28 crc kubenswrapper[4599]: I1012 07:50:28.323662 4599 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f791fc6fe233d5a2dcb3bd14d2fd8d76369bf4f0ae51317c4f0bb3b0e75a17de"} pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 12 07:50:28 crc kubenswrapper[4599]: I1012 07:50:28.323732 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" containerID="cri-o://f791fc6fe233d5a2dcb3bd14d2fd8d76369bf4f0ae51317c4f0bb3b0e75a17de" gracePeriod=600
Oct 12 07:50:28 crc kubenswrapper[4599]: I1012 07:50:28.534797 4599 generic.go:334] "Generic (PLEG): container finished" podID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerID="f791fc6fe233d5a2dcb3bd14d2fd8d76369bf4f0ae51317c4f0bb3b0e75a17de" exitCode=0
Oct 12 07:50:28 crc kubenswrapper[4599]: I1012 07:50:28.534839 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerDied","Data":"f791fc6fe233d5a2dcb3bd14d2fd8d76369bf4f0ae51317c4f0bb3b0e75a17de"}
Oct 12 07:50:28 crc kubenswrapper[4599]: I1012 07:50:28.535029 4599 scope.go:117] "RemoveContainer" containerID="62dd115f3eaf8ba983cf13f3b84adc51fbb09341d1c83aeb28106a411652e265"
Oct 12 07:50:28 crc kubenswrapper[4599]: I1012 07:50:28.535306 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vvd6p" podUID="fdd9da54-4ee3-40c8-af3b-fb522971c160" containerName="registry-server" containerID="cri-o://8a9161f6eb5724ce2b7b744603c36a31e9bbbf8a9f287ef35511a0953a5edf70" gracePeriod=2
Oct 12 07:50:28 crc kubenswrapper[4599]: I1012 07:50:28.670537 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rk6kl"]
Oct 12 07:50:28 crc kubenswrapper[4599]: I1012 07:50:28.975821 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vvd6p"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.128090 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw9m8\" (UniqueName: \"kubernetes.io/projected/fdd9da54-4ee3-40c8-af3b-fb522971c160-kube-api-access-tw9m8\") pod \"fdd9da54-4ee3-40c8-af3b-fb522971c160\" (UID: \"fdd9da54-4ee3-40c8-af3b-fb522971c160\") "
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.128320 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd9da54-4ee3-40c8-af3b-fb522971c160-catalog-content\") pod \"fdd9da54-4ee3-40c8-af3b-fb522971c160\" (UID: \"fdd9da54-4ee3-40c8-af3b-fb522971c160\") "
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.128435 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd9da54-4ee3-40c8-af3b-fb522971c160-utilities\") pod \"fdd9da54-4ee3-40c8-af3b-fb522971c160\" (UID: \"fdd9da54-4ee3-40c8-af3b-fb522971c160\") "
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.129196 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd9da54-4ee3-40c8-af3b-fb522971c160-utilities" (OuterVolumeSpecName: "utilities") pod "fdd9da54-4ee3-40c8-af3b-fb522971c160" (UID: "fdd9da54-4ee3-40c8-af3b-fb522971c160"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.133533 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdd9da54-4ee3-40c8-af3b-fb522971c160-kube-api-access-tw9m8" (OuterVolumeSpecName: "kube-api-access-tw9m8") pod "fdd9da54-4ee3-40c8-af3b-fb522971c160" (UID: "fdd9da54-4ee3-40c8-af3b-fb522971c160"). InnerVolumeSpecName "kube-api-access-tw9m8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.167193 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd9da54-4ee3-40c8-af3b-fb522971c160-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdd9da54-4ee3-40c8-af3b-fb522971c160" (UID: "fdd9da54-4ee3-40c8-af3b-fb522971c160"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.228489 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.231197 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd9da54-4ee3-40c8-af3b-fb522971c160-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.231234 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd9da54-4ee3-40c8-af3b-fb522971c160-utilities\") on node \"crc\" DevicePath \"\""
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.231246 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw9m8\" (UniqueName: \"kubernetes.io/projected/fdd9da54-4ee3-40c8-af3b-fb522971c160-kube-api-access-tw9m8\") on node \"crc\" DevicePath \"\""
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.241399 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.381411 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-365a-account-create-hgsfs"]
Oct 12 07:50:29 crc kubenswrapper[4599]: E1012 07:50:29.381900 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc5e8701-df4e-459c-85ae-78db217fcea0" containerName="neutron-api"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.381918 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc5e8701-df4e-459c-85ae-78db217fcea0" containerName="neutron-api"
Oct 12 07:50:29 crc kubenswrapper[4599]: E1012 07:50:29.381934 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc5e8701-df4e-459c-85ae-78db217fcea0" containerName="neutron-httpd"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.381941 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc5e8701-df4e-459c-85ae-78db217fcea0" containerName="neutron-httpd"
Oct 12 07:50:29 crc kubenswrapper[4599]: E1012 07:50:29.381954 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23162836-2301-4422-b6c0-30591a99bed0" containerName="mariadb-database-create"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.381960 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="23162836-2301-4422-b6c0-30591a99bed0" containerName="mariadb-database-create"
Oct 12 07:50:29 crc kubenswrapper[4599]: E1012 07:50:29.381969 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd9da54-4ee3-40c8-af3b-fb522971c160" containerName="extract-content"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.381973 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd9da54-4ee3-40c8-af3b-fb522971c160" containerName="extract-content"
Oct 12 07:50:29 crc kubenswrapper[4599]: E1012 07:50:29.381984 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84022a3c-4f3a-40f7-b3f6-6aa3a83fb174" containerName="mariadb-database-create"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.381990 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="84022a3c-4f3a-40f7-b3f6-6aa3a83fb174" containerName="mariadb-database-create"
Oct 12 07:50:29 crc kubenswrapper[4599]: E1012 07:50:29.382012 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac30765e-04ab-4e53-87ca-8d5bcf9df91b" containerName="mariadb-database-create"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.382020 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac30765e-04ab-4e53-87ca-8d5bcf9df91b" containerName="mariadb-database-create"
Oct 12 07:50:29 crc kubenswrapper[4599]: E1012 07:50:29.382038 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd9da54-4ee3-40c8-af3b-fb522971c160" containerName="extract-utilities"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.382043 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd9da54-4ee3-40c8-af3b-fb522971c160" containerName="extract-utilities"
Oct 12 07:50:29 crc kubenswrapper[4599]: E1012 07:50:29.382053 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd9da54-4ee3-40c8-af3b-fb522971c160" containerName="registry-server"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.382059 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd9da54-4ee3-40c8-af3b-fb522971c160" containerName="registry-server"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.382249 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="84022a3c-4f3a-40f7-b3f6-6aa3a83fb174" containerName="mariadb-database-create"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.382262 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdd9da54-4ee3-40c8-af3b-fb522971c160" containerName="registry-server"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.382268 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac30765e-04ab-4e53-87ca-8d5bcf9df91b" containerName="mariadb-database-create"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.382277 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc5e8701-df4e-459c-85ae-78db217fcea0" containerName="neutron-httpd"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.382286 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc5e8701-df4e-459c-85ae-78db217fcea0" containerName="neutron-api"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.382295 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="23162836-2301-4422-b6c0-30591a99bed0" containerName="mariadb-database-create"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.382958 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-365a-account-create-hgsfs"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.391530 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.395051 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-365a-account-create-hgsfs"]
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.535764 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpxwq\" (UniqueName: \"kubernetes.io/projected/c351afb9-ec01-409c-bde8-b466a94e4d9d-kube-api-access-vpxwq\") pod \"nova-api-365a-account-create-hgsfs\" (UID: \"c351afb9-ec01-409c-bde8-b466a94e4d9d\") " pod="openstack/nova-api-365a-account-create-hgsfs"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.556795 4599 generic.go:334] "Generic (PLEG): container finished" podID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" containerID="a322b17e1734018bc819124a518971344bf7d4c215c1f9395b066f21d8246130" exitCode=0
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.558218 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad","Type":"ContainerDied","Data":"a322b17e1734018bc819124a518971344bf7d4c215c1f9395b066f21d8246130"}
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.578000 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerStarted","Data":"409590f6935d88c6579c0848195b75ccd573f94456a7d800342528199b5f70c8"}
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.597798 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-d46a-account-create-gnb8g"]
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.603492 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d46a-account-create-gnb8g"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.606288 4599 generic.go:334] "Generic (PLEG): container finished" podID="fdd9da54-4ee3-40c8-af3b-fb522971c160" containerID="8a9161f6eb5724ce2b7b744603c36a31e9bbbf8a9f287ef35511a0953a5edf70" exitCode=0
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.606604 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rk6kl" podUID="9f317920-4f82-4691-8e1c-31b0b35526e8" containerName="registry-server" containerID="cri-o://b638aaa1589d905594aec15c290c10b79fe2cdcf894d90b0dfacac9aa4f378b7" gracePeriod=2
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.607031 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vvd6p"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.607779 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vvd6p" event={"ID":"fdd9da54-4ee3-40c8-af3b-fb522971c160","Type":"ContainerDied","Data":"8a9161f6eb5724ce2b7b744603c36a31e9bbbf8a9f287ef35511a0953a5edf70"}
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.607812 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vvd6p" event={"ID":"fdd9da54-4ee3-40c8-af3b-fb522971c160","Type":"ContainerDied","Data":"98d8990c532bc8d4bd995fad295fda1d7f633ac8745f0c04b80852807c81309a"}
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.607830 4599 scope.go:117] "RemoveContainer" containerID="8a9161f6eb5724ce2b7b744603c36a31e9bbbf8a9f287ef35511a0953a5edf70"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.613688 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.620038 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d46a-account-create-gnb8g"]
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.638678 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpxwq\" (UniqueName: \"kubernetes.io/projected/c351afb9-ec01-409c-bde8-b466a94e4d9d-kube-api-access-vpxwq\") pod \"nova-api-365a-account-create-hgsfs\" (UID: \"c351afb9-ec01-409c-bde8-b466a94e4d9d\") " pod="openstack/nova-api-365a-account-create-hgsfs"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.661149 4599 scope.go:117] "RemoveContainer" containerID="8b426305db57f447fca7b13b43be22f2b95a9f61dee44bf35039d349186b6df1"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.670415 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpxwq\" (UniqueName: \"kubernetes.io/projected/c351afb9-ec01-409c-bde8-b466a94e4d9d-kube-api-access-vpxwq\") pod \"nova-api-365a-account-create-hgsfs\" (UID: \"c351afb9-ec01-409c-bde8-b466a94e4d9d\") " pod="openstack/nova-api-365a-account-create-hgsfs"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.677411 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vvd6p"]
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.683841 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vvd6p"]
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.741076 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj6n8\" (UniqueName: \"kubernetes.io/projected/d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3-kube-api-access-vj6n8\") pod \"nova-cell0-d46a-account-create-gnb8g\" (UID: \"d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3\") " pod="openstack/nova-cell0-d46a-account-create-gnb8g"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.782145 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-8bb3-account-create-cfxfp"]
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.783513 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8bb3-account-create-cfxfp"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.785051 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.794038 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-365a-account-create-hgsfs"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.805221 4599 scope.go:117] "RemoveContainer" containerID="958085b6614b6b8062018ad8f344e2baa29e201b99985fc68ec2442f058e8924"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.812532 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8bb3-account-create-cfxfp"]
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.820542 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.848136 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj6n8\" (UniqueName: \"kubernetes.io/projected/d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3-kube-api-access-vj6n8\") pod \"nova-cell0-d46a-account-create-gnb8g\" (UID: \"d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3\") " pod="openstack/nova-cell0-d46a-account-create-gnb8g"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.879711 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj6n8\" (UniqueName: \"kubernetes.io/projected/d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3-kube-api-access-vj6n8\") pod \"nova-cell0-d46a-account-create-gnb8g\" (UID: \"d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3\") " pod="openstack/nova-cell0-d46a-account-create-gnb8g"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.892206 4599 scope.go:117] "RemoveContainer" containerID="8a9161f6eb5724ce2b7b744603c36a31e9bbbf8a9f287ef35511a0953a5edf70"
Oct 12 07:50:29 crc kubenswrapper[4599]: E1012 07:50:29.893262 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a9161f6eb5724ce2b7b744603c36a31e9bbbf8a9f287ef35511a0953a5edf70\": container with ID starting with 8a9161f6eb5724ce2b7b744603c36a31e9bbbf8a9f287ef35511a0953a5edf70 not found: ID does not exist" containerID="8a9161f6eb5724ce2b7b744603c36a31e9bbbf8a9f287ef35511a0953a5edf70"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.893297 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a9161f6eb5724ce2b7b744603c36a31e9bbbf8a9f287ef35511a0953a5edf70"} err="failed to get container status \"8a9161f6eb5724ce2b7b744603c36a31e9bbbf8a9f287ef35511a0953a5edf70\": rpc error: code = NotFound desc = could not find container \"8a9161f6eb5724ce2b7b744603c36a31e9bbbf8a9f287ef35511a0953a5edf70\": container with ID starting with 8a9161f6eb5724ce2b7b744603c36a31e9bbbf8a9f287ef35511a0953a5edf70 not found: ID does not exist"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.893328 4599 scope.go:117] "RemoveContainer" containerID="8b426305db57f447fca7b13b43be22f2b95a9f61dee44bf35039d349186b6df1"
Oct 12 07:50:29 crc kubenswrapper[4599]: E1012 07:50:29.894246 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b426305db57f447fca7b13b43be22f2b95a9f61dee44bf35039d349186b6df1\": container with ID starting with 8b426305db57f447fca7b13b43be22f2b95a9f61dee44bf35039d349186b6df1 not found: ID does not exist" containerID="8b426305db57f447fca7b13b43be22f2b95a9f61dee44bf35039d349186b6df1"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.894286 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b426305db57f447fca7b13b43be22f2b95a9f61dee44bf35039d349186b6df1"} err="failed to get container status \"8b426305db57f447fca7b13b43be22f2b95a9f61dee44bf35039d349186b6df1\": rpc error: code = NotFound desc = could not find container \"8b426305db57f447fca7b13b43be22f2b95a9f61dee44bf35039d349186b6df1\": container with ID starting with 8b426305db57f447fca7b13b43be22f2b95a9f61dee44bf35039d349186b6df1 not found: ID does not exist"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.894316 4599 scope.go:117] "RemoveContainer" containerID="958085b6614b6b8062018ad8f344e2baa29e201b99985fc68ec2442f058e8924"
Oct 12 07:50:29 crc kubenswrapper[4599]: E1012 07:50:29.894814 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"958085b6614b6b8062018ad8f344e2baa29e201b99985fc68ec2442f058e8924\": container with ID starting with 958085b6614b6b8062018ad8f344e2baa29e201b99985fc68ec2442f058e8924 not found: ID does not exist" containerID="958085b6614b6b8062018ad8f344e2baa29e201b99985fc68ec2442f058e8924"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.894864 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"958085b6614b6b8062018ad8f344e2baa29e201b99985fc68ec2442f058e8924"} err="failed to get container status \"958085b6614b6b8062018ad8f344e2baa29e201b99985fc68ec2442f058e8924\": rpc error: code = NotFound desc = could not find container \"958085b6614b6b8062018ad8f344e2baa29e201b99985fc68ec2442f058e8924\": container with ID starting with 958085b6614b6b8062018ad8f344e2baa29e201b99985fc68ec2442f058e8924 not found: ID does not exist"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.948408 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d46a-account-create-gnb8g"
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.950243 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" (UID: "ad8ac3e7-8df6-433e-8fb6-2ed728f133ad"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.949809 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-run-httpd\") pod \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") "
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.953563 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-scripts\") pod \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") "
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.953617 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-combined-ca-bundle\") pod \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") "
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.953638 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-config-data\") pod \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") "
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.953689 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqqsh\" (UniqueName: \"kubernetes.io/projected/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-kube-api-access-sqqsh\") pod \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") "
Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.953818 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName:
\"kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-sg-core-conf-yaml\") pod \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.953850 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-log-httpd\") pod \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\" (UID: \"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad\") " Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.954401 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6clp9\" (UniqueName: \"kubernetes.io/projected/f812212a-8c44-416a-8671-bceb361a6779-kube-api-access-6clp9\") pod \"nova-cell1-8bb3-account-create-cfxfp\" (UID: \"f812212a-8c44-416a-8671-bceb361a6779\") " pod="openstack/nova-cell1-8bb3-account-create-cfxfp" Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.954571 4599 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.954980 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" (UID: "ad8ac3e7-8df6-433e-8fb6-2ed728f133ad"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.966516 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-kube-api-access-sqqsh" (OuterVolumeSpecName: "kube-api-access-sqqsh") pod "ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" (UID: "ad8ac3e7-8df6-433e-8fb6-2ed728f133ad"). InnerVolumeSpecName "kube-api-access-sqqsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.972865 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-scripts" (OuterVolumeSpecName: "scripts") pod "ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" (UID: "ad8ac3e7-8df6-433e-8fb6-2ed728f133ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:29 crc kubenswrapper[4599]: I1012 07:50:29.991428 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" (UID: "ad8ac3e7-8df6-433e-8fb6-2ed728f133ad"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.058962 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6clp9\" (UniqueName: \"kubernetes.io/projected/f812212a-8c44-416a-8671-bceb361a6779-kube-api-access-6clp9\") pod \"nova-cell1-8bb3-account-create-cfxfp\" (UID: \"f812212a-8c44-416a-8671-bceb361a6779\") " pod="openstack/nova-cell1-8bb3-account-create-cfxfp" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.059363 4599 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.059380 4599 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.059390 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.059404 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqqsh\" (UniqueName: \"kubernetes.io/projected/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-kube-api-access-sqqsh\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.061402 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" (UID: "ad8ac3e7-8df6-433e-8fb6-2ed728f133ad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.064473 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rk6kl" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.077546 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6clp9\" (UniqueName: \"kubernetes.io/projected/f812212a-8c44-416a-8671-bceb361a6779-kube-api-access-6clp9\") pod \"nova-cell1-8bb3-account-create-cfxfp\" (UID: \"f812212a-8c44-416a-8671-bceb361a6779\") " pod="openstack/nova-cell1-8bb3-account-create-cfxfp" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.132127 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-config-data" (OuterVolumeSpecName: "config-data") pod "ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" (UID: "ad8ac3e7-8df6-433e-8fb6-2ed728f133ad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.160286 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c5zt\" (UniqueName: \"kubernetes.io/projected/9f317920-4f82-4691-8e1c-31b0b35526e8-kube-api-access-2c5zt\") pod \"9f317920-4f82-4691-8e1c-31b0b35526e8\" (UID: \"9f317920-4f82-4691-8e1c-31b0b35526e8\") " Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.160462 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f317920-4f82-4691-8e1c-31b0b35526e8-utilities\") pod \"9f317920-4f82-4691-8e1c-31b0b35526e8\" (UID: \"9f317920-4f82-4691-8e1c-31b0b35526e8\") " Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.160701 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f317920-4f82-4691-8e1c-31b0b35526e8-catalog-content\") pod \"9f317920-4f82-4691-8e1c-31b0b35526e8\" (UID: \"9f317920-4f82-4691-8e1c-31b0b35526e8\") " Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.161130 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.161149 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.161707 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f317920-4f82-4691-8e1c-31b0b35526e8-utilities" (OuterVolumeSpecName: "utilities") pod "9f317920-4f82-4691-8e1c-31b0b35526e8" (UID: "9f317920-4f82-4691-8e1c-31b0b35526e8"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.168620 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f317920-4f82-4691-8e1c-31b0b35526e8-kube-api-access-2c5zt" (OuterVolumeSpecName: "kube-api-access-2c5zt") pod "9f317920-4f82-4691-8e1c-31b0b35526e8" (UID: "9f317920-4f82-4691-8e1c-31b0b35526e8"). InnerVolumeSpecName "kube-api-access-2c5zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.171099 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8bb3-account-create-cfxfp" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.172504 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f317920-4f82-4691-8e1c-31b0b35526e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f317920-4f82-4691-8e1c-31b0b35526e8" (UID: "9f317920-4f82-4691-8e1c-31b0b35526e8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.264017 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c5zt\" (UniqueName: \"kubernetes.io/projected/9f317920-4f82-4691-8e1c-31b0b35526e8-kube-api-access-2c5zt\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.264248 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f317920-4f82-4691-8e1c-31b0b35526e8-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.264260 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f317920-4f82-4691-8e1c-31b0b35526e8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.291129 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-365a-account-create-hgsfs"] Oct 12 07:50:30 crc kubenswrapper[4599]: W1012 07:50:30.298222 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc351afb9_ec01_409c_bde8_b466a94e4d9d.slice/crio-9dcb916bd8bdb5793c5f21a563836c83c224f09fb4feaaddb62b50ccf56ce1cb WatchSource:0}: Error finding container 9dcb916bd8bdb5793c5f21a563836c83c224f09fb4feaaddb62b50ccf56ce1cb: Status 404 returned error can't find the container with id 9dcb916bd8bdb5793c5f21a563836c83c224f09fb4feaaddb62b50ccf56ce1cb Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.428208 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d46a-account-create-gnb8g"] Oct 12 07:50:30 crc kubenswrapper[4599]: W1012 07:50:30.430604 4599 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd20b8ffe_fcc1_4097_a968_0cbe4bcd30a3.slice/crio-fb7c2da8971af63bde778963785db985ecf597e624843e99e601031c3c665388 WatchSource:0}: Error finding container fb7c2da8971af63bde778963785db985ecf597e624843e99e601031c3c665388: Status 404 returned error can't find the container with id fb7c2da8971af63bde778963785db985ecf597e624843e99e601031c3c665388 Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.505850 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.583203 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8bb3-account-create-cfxfp"] Oct 12 07:50:30 crc kubenswrapper[4599]: W1012 07:50:30.607420 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf812212a_8c44_416a_8671_bceb361a6779.slice/crio-dd5b85bd9b4ddbce010c3600d78db5d94648e1d170a42a9b9f0c451c2f59c581 WatchSource:0}: Error finding container dd5b85bd9b4ddbce010c3600d78db5d94648e1d170a42a9b9f0c451c2f59c581: Status 404 returned error can't find the container with id dd5b85bd9b4ddbce010c3600d78db5d94648e1d170a42a9b9f0c451c2f59c581 Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.640327 4599 generic.go:334] "Generic (PLEG): container finished" podID="c351afb9-ec01-409c-bde8-b466a94e4d9d" containerID="aa5dd1932318e1b477a7aa7cbe3e7779728a34d3e12999e78e4da4a5cd10f3b6" exitCode=0 Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.640791 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-365a-account-create-hgsfs" event={"ID":"c351afb9-ec01-409c-bde8-b466a94e4d9d","Type":"ContainerDied","Data":"aa5dd1932318e1b477a7aa7cbe3e7779728a34d3e12999e78e4da4a5cd10f3b6"} Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.640818 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-365a-account-create-hgsfs" event={"ID":"c351afb9-ec01-409c-bde8-b466a94e4d9d","Type":"ContainerStarted","Data":"9dcb916bd8bdb5793c5f21a563836c83c224f09fb4feaaddb62b50ccf56ce1cb"} Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.655210 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad8ac3e7-8df6-433e-8fb6-2ed728f133ad","Type":"ContainerDied","Data":"edb4d5bd111e802bd3eaeefffa1b14c126cd8f1b1484eea9f9280ef5e2c359e6"} Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.655271 4599 scope.go:117] "RemoveContainer" containerID="e809f78f30205fffacf4be5e9441fd40efaafd14eca9b39d55b55e314a7578f5" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.655416 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.695643 4599 generic.go:334] "Generic (PLEG): container finished" podID="9f317920-4f82-4691-8e1c-31b0b35526e8" containerID="b638aaa1589d905594aec15c290c10b79fe2cdcf894d90b0dfacac9aa4f378b7" exitCode=0 Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.695873 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rk6kl" event={"ID":"9f317920-4f82-4691-8e1c-31b0b35526e8","Type":"ContainerDied","Data":"b638aaa1589d905594aec15c290c10b79fe2cdcf894d90b0dfacac9aa4f378b7"} Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.695962 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rk6kl" event={"ID":"9f317920-4f82-4691-8e1c-31b0b35526e8","Type":"ContainerDied","Data":"d66a28f2ed0b2c521f2fb1f7a3b55ae8b19955a89a1cec31c7131bce347b42bd"} Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.696076 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rk6kl" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.708365 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d46a-account-create-gnb8g" event={"ID":"d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3","Type":"ContainerStarted","Data":"fb7c2da8971af63bde778963785db985ecf597e624843e99e601031c3c665388"} Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.715403 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.727577 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.728457 4599 scope.go:117] "RemoveContainer" containerID="c703f77069f7bc020c223a930b6f6fa64c4d462d188dd0b6b07b7916886cf861" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.737398 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:50:30 crc kubenswrapper[4599]: E1012 07:50:30.737829 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f317920-4f82-4691-8e1c-31b0b35526e8" containerName="registry-server" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.737842 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f317920-4f82-4691-8e1c-31b0b35526e8" containerName="registry-server" Oct 12 07:50:30 crc kubenswrapper[4599]: E1012 07:50:30.737856 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f317920-4f82-4691-8e1c-31b0b35526e8" containerName="extract-content" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.737861 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f317920-4f82-4691-8e1c-31b0b35526e8" containerName="extract-content" Oct 12 07:50:30 crc kubenswrapper[4599]: E1012 07:50:30.737877 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" 
containerName="sg-core" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.737883 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" containerName="sg-core" Oct 12 07:50:30 crc kubenswrapper[4599]: E1012 07:50:30.737896 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" containerName="ceilometer-notification-agent" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.737901 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" containerName="ceilometer-notification-agent" Oct 12 07:50:30 crc kubenswrapper[4599]: E1012 07:50:30.737929 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f317920-4f82-4691-8e1c-31b0b35526e8" containerName="extract-utilities" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.737934 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f317920-4f82-4691-8e1c-31b0b35526e8" containerName="extract-utilities" Oct 12 07:50:30 crc kubenswrapper[4599]: E1012 07:50:30.737945 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" containerName="proxy-httpd" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.737950 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" containerName="proxy-httpd" Oct 12 07:50:30 crc kubenswrapper[4599]: E1012 07:50:30.737959 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" containerName="ceilometer-central-agent" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.737965 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" containerName="ceilometer-central-agent" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.738124 4599 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" containerName="ceilometer-notification-agent" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.738138 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" containerName="sg-core" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.738150 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f317920-4f82-4691-8e1c-31b0b35526e8" containerName="registry-server" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.738161 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" containerName="ceilometer-central-agent" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.738180 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" containerName="proxy-httpd" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.746765 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.747095 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.751475 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.753483 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.764347 4599 scope.go:117] "RemoveContainer" containerID="b6e98c23c623db132e7de729ee558b4d755f97b5bf06ed983fb5cf5d69aa1dbd" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.774829 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rk6kl"] Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.789156 4599 scope.go:117] "RemoveContainer" containerID="a322b17e1734018bc819124a518971344bf7d4c215c1f9395b066f21d8246130" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.806449 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rk6kl"] Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.819155 4599 scope.go:117] "RemoveContainer" containerID="b638aaa1589d905594aec15c290c10b79fe2cdcf894d90b0dfacac9aa4f378b7" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.888374 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") " pod="openstack/ceilometer-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.888496 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-config-data\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") " 
pod="openstack/ceilometer-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.888562 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18f19858-7b8d-4c40-afde-102c00500fe0-log-httpd\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") " pod="openstack/ceilometer-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.888629 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18f19858-7b8d-4c40-afde-102c00500fe0-run-httpd\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") " pod="openstack/ceilometer-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.888673 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") " pod="openstack/ceilometer-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.888761 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87h6w\" (UniqueName: \"kubernetes.io/projected/18f19858-7b8d-4c40-afde-102c00500fe0-kube-api-access-87h6w\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") " pod="openstack/ceilometer-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.888795 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-scripts\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") " pod="openstack/ceilometer-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.895561 4599 scope.go:117] "RemoveContainer" 
containerID="ab4347a8dd9086883071cf633a1b3ed022ddc06d202e90171c2686bcf93e2b81" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.963823 4599 scope.go:117] "RemoveContainer" containerID="f9cc29531352bab33de61b83e169f4cc518d5c558085a4b25545c45ffe540bc5" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.990751 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-config-data\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") " pod="openstack/ceilometer-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.990843 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18f19858-7b8d-4c40-afde-102c00500fe0-log-httpd\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") " pod="openstack/ceilometer-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.990912 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18f19858-7b8d-4c40-afde-102c00500fe0-run-httpd\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") " pod="openstack/ceilometer-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.990964 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") " pod="openstack/ceilometer-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.991024 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87h6w\" (UniqueName: \"kubernetes.io/projected/18f19858-7b8d-4c40-afde-102c00500fe0-kube-api-access-87h6w\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") 
" pod="openstack/ceilometer-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.991058 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-scripts\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") " pod="openstack/ceilometer-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.991124 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") " pod="openstack/ceilometer-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.992668 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18f19858-7b8d-4c40-afde-102c00500fe0-log-httpd\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") " pod="openstack/ceilometer-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.992877 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18f19858-7b8d-4c40-afde-102c00500fe0-run-httpd\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") " pod="openstack/ceilometer-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.996771 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") " pod="openstack/ceilometer-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.997200 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") " pod="openstack/ceilometer-0" Oct 12 07:50:30 crc kubenswrapper[4599]: I1012 07:50:30.997380 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-scripts\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") " pod="openstack/ceilometer-0" Oct 12 07:50:31 crc kubenswrapper[4599]: I1012 07:50:31.001538 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-config-data\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") " pod="openstack/ceilometer-0" Oct 12 07:50:31 crc kubenswrapper[4599]: I1012 07:50:31.006016 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87h6w\" (UniqueName: \"kubernetes.io/projected/18f19858-7b8d-4c40-afde-102c00500fe0-kube-api-access-87h6w\") pod \"ceilometer-0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") " pod="openstack/ceilometer-0" Oct 12 07:50:31 crc kubenswrapper[4599]: I1012 07:50:31.007530 4599 scope.go:117] "RemoveContainer" containerID="b638aaa1589d905594aec15c290c10b79fe2cdcf894d90b0dfacac9aa4f378b7" Oct 12 07:50:31 crc kubenswrapper[4599]: E1012 07:50:31.007924 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b638aaa1589d905594aec15c290c10b79fe2cdcf894d90b0dfacac9aa4f378b7\": container with ID starting with b638aaa1589d905594aec15c290c10b79fe2cdcf894d90b0dfacac9aa4f378b7 not found: ID does not exist" containerID="b638aaa1589d905594aec15c290c10b79fe2cdcf894d90b0dfacac9aa4f378b7" Oct 12 07:50:31 crc kubenswrapper[4599]: I1012 07:50:31.007952 4599 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b638aaa1589d905594aec15c290c10b79fe2cdcf894d90b0dfacac9aa4f378b7"} err="failed to get container status \"b638aaa1589d905594aec15c290c10b79fe2cdcf894d90b0dfacac9aa4f378b7\": rpc error: code = NotFound desc = could not find container \"b638aaa1589d905594aec15c290c10b79fe2cdcf894d90b0dfacac9aa4f378b7\": container with ID starting with b638aaa1589d905594aec15c290c10b79fe2cdcf894d90b0dfacac9aa4f378b7 not found: ID does not exist" Oct 12 07:50:31 crc kubenswrapper[4599]: I1012 07:50:31.007980 4599 scope.go:117] "RemoveContainer" containerID="ab4347a8dd9086883071cf633a1b3ed022ddc06d202e90171c2686bcf93e2b81" Oct 12 07:50:31 crc kubenswrapper[4599]: E1012 07:50:31.008302 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab4347a8dd9086883071cf633a1b3ed022ddc06d202e90171c2686bcf93e2b81\": container with ID starting with ab4347a8dd9086883071cf633a1b3ed022ddc06d202e90171c2686bcf93e2b81 not found: ID does not exist" containerID="ab4347a8dd9086883071cf633a1b3ed022ddc06d202e90171c2686bcf93e2b81" Oct 12 07:50:31 crc kubenswrapper[4599]: I1012 07:50:31.008317 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab4347a8dd9086883071cf633a1b3ed022ddc06d202e90171c2686bcf93e2b81"} err="failed to get container status \"ab4347a8dd9086883071cf633a1b3ed022ddc06d202e90171c2686bcf93e2b81\": rpc error: code = NotFound desc = could not find container \"ab4347a8dd9086883071cf633a1b3ed022ddc06d202e90171c2686bcf93e2b81\": container with ID starting with ab4347a8dd9086883071cf633a1b3ed022ddc06d202e90171c2686bcf93e2b81 not found: ID does not exist" Oct 12 07:50:31 crc kubenswrapper[4599]: I1012 07:50:31.008356 4599 scope.go:117] "RemoveContainer" containerID="f9cc29531352bab33de61b83e169f4cc518d5c558085a4b25545c45ffe540bc5" Oct 12 07:50:31 crc kubenswrapper[4599]: E1012 07:50:31.011959 4599 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f9cc29531352bab33de61b83e169f4cc518d5c558085a4b25545c45ffe540bc5\": container with ID starting with f9cc29531352bab33de61b83e169f4cc518d5c558085a4b25545c45ffe540bc5 not found: ID does not exist" containerID="f9cc29531352bab33de61b83e169f4cc518d5c558085a4b25545c45ffe540bc5" Oct 12 07:50:31 crc kubenswrapper[4599]: I1012 07:50:31.012019 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9cc29531352bab33de61b83e169f4cc518d5c558085a4b25545c45ffe540bc5"} err="failed to get container status \"f9cc29531352bab33de61b83e169f4cc518d5c558085a4b25545c45ffe540bc5\": rpc error: code = NotFound desc = could not find container \"f9cc29531352bab33de61b83e169f4cc518d5c558085a4b25545c45ffe540bc5\": container with ID starting with f9cc29531352bab33de61b83e169f4cc518d5c558085a4b25545c45ffe540bc5 not found: ID does not exist" Oct 12 07:50:31 crc kubenswrapper[4599]: I1012 07:50:31.073140 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:50:31 crc kubenswrapper[4599]: I1012 07:50:31.493203 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:50:31 crc kubenswrapper[4599]: I1012 07:50:31.554197 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f317920-4f82-4691-8e1c-31b0b35526e8" path="/var/lib/kubelet/pods/9f317920-4f82-4691-8e1c-31b0b35526e8/volumes" Oct 12 07:50:31 crc kubenswrapper[4599]: I1012 07:50:31.554996 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad8ac3e7-8df6-433e-8fb6-2ed728f133ad" path="/var/lib/kubelet/pods/ad8ac3e7-8df6-433e-8fb6-2ed728f133ad/volumes" Oct 12 07:50:31 crc kubenswrapper[4599]: I1012 07:50:31.556222 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdd9da54-4ee3-40c8-af3b-fb522971c160" path="/var/lib/kubelet/pods/fdd9da54-4ee3-40c8-af3b-fb522971c160/volumes" Oct 12 07:50:31 crc kubenswrapper[4599]: I1012 07:50:31.719697 4599 generic.go:334] "Generic (PLEG): container finished" podID="f812212a-8c44-416a-8671-bceb361a6779" containerID="0cf8e6c98ac835ac25ab6609fceef30863df8bc0ca707bb8b0066f7f1bd23692" exitCode=0 Oct 12 07:50:31 crc kubenswrapper[4599]: I1012 07:50:31.719783 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8bb3-account-create-cfxfp" event={"ID":"f812212a-8c44-416a-8671-bceb361a6779","Type":"ContainerDied","Data":"0cf8e6c98ac835ac25ab6609fceef30863df8bc0ca707bb8b0066f7f1bd23692"} Oct 12 07:50:31 crc kubenswrapper[4599]: I1012 07:50:31.719814 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8bb3-account-create-cfxfp" event={"ID":"f812212a-8c44-416a-8671-bceb361a6779","Type":"ContainerStarted","Data":"dd5b85bd9b4ddbce010c3600d78db5d94648e1d170a42a9b9f0c451c2f59c581"} Oct 12 07:50:31 crc kubenswrapper[4599]: I1012 07:50:31.722647 4599 generic.go:334] "Generic (PLEG): container finished" 
podID="d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3" containerID="83db2f60d001c5dcb6721a23c2ad2ab8bbbe0aa9486ba3cbf623c5b688aa256c" exitCode=0 Oct 12 07:50:31 crc kubenswrapper[4599]: I1012 07:50:31.722729 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d46a-account-create-gnb8g" event={"ID":"d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3","Type":"ContainerDied","Data":"83db2f60d001c5dcb6721a23c2ad2ab8bbbe0aa9486ba3cbf623c5b688aa256c"} Oct 12 07:50:31 crc kubenswrapper[4599]: I1012 07:50:31.723866 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18f19858-7b8d-4c40-afde-102c00500fe0","Type":"ContainerStarted","Data":"f237955579909e1c4bc6d182e14ba03f1b581fc1cd2c657361f3dae91c2afbab"} Oct 12 07:50:32 crc kubenswrapper[4599]: I1012 07:50:32.037067 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-365a-account-create-hgsfs" Oct 12 07:50:32 crc kubenswrapper[4599]: I1012 07:50:32.218463 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpxwq\" (UniqueName: \"kubernetes.io/projected/c351afb9-ec01-409c-bde8-b466a94e4d9d-kube-api-access-vpxwq\") pod \"c351afb9-ec01-409c-bde8-b466a94e4d9d\" (UID: \"c351afb9-ec01-409c-bde8-b466a94e4d9d\") " Oct 12 07:50:32 crc kubenswrapper[4599]: I1012 07:50:32.225388 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c351afb9-ec01-409c-bde8-b466a94e4d9d-kube-api-access-vpxwq" (OuterVolumeSpecName: "kube-api-access-vpxwq") pod "c351afb9-ec01-409c-bde8-b466a94e4d9d" (UID: "c351afb9-ec01-409c-bde8-b466a94e4d9d"). InnerVolumeSpecName "kube-api-access-vpxwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:50:32 crc kubenswrapper[4599]: I1012 07:50:32.322274 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpxwq\" (UniqueName: \"kubernetes.io/projected/c351afb9-ec01-409c-bde8-b466a94e4d9d-kube-api-access-vpxwq\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:32 crc kubenswrapper[4599]: I1012 07:50:32.736365 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-365a-account-create-hgsfs" event={"ID":"c351afb9-ec01-409c-bde8-b466a94e4d9d","Type":"ContainerDied","Data":"9dcb916bd8bdb5793c5f21a563836c83c224f09fb4feaaddb62b50ccf56ce1cb"} Oct 12 07:50:32 crc kubenswrapper[4599]: I1012 07:50:32.736716 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dcb916bd8bdb5793c5f21a563836c83c224f09fb4feaaddb62b50ccf56ce1cb" Oct 12 07:50:32 crc kubenswrapper[4599]: I1012 07:50:32.736398 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-365a-account-create-hgsfs" Oct 12 07:50:32 crc kubenswrapper[4599]: I1012 07:50:32.739305 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18f19858-7b8d-4c40-afde-102c00500fe0","Type":"ContainerStarted","Data":"be351c4ab187da14c841501e1db3f02eca70d2ffcf8fd7925cbf3e97393380de"} Oct 12 07:50:33 crc kubenswrapper[4599]: I1012 07:50:33.120450 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8bb3-account-create-cfxfp" Oct 12 07:50:33 crc kubenswrapper[4599]: I1012 07:50:33.124991 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d46a-account-create-gnb8g" Oct 12 07:50:33 crc kubenswrapper[4599]: I1012 07:50:33.242518 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6clp9\" (UniqueName: \"kubernetes.io/projected/f812212a-8c44-416a-8671-bceb361a6779-kube-api-access-6clp9\") pod \"f812212a-8c44-416a-8671-bceb361a6779\" (UID: \"f812212a-8c44-416a-8671-bceb361a6779\") " Oct 12 07:50:33 crc kubenswrapper[4599]: I1012 07:50:33.242574 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj6n8\" (UniqueName: \"kubernetes.io/projected/d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3-kube-api-access-vj6n8\") pod \"d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3\" (UID: \"d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3\") " Oct 12 07:50:33 crc kubenswrapper[4599]: I1012 07:50:33.249628 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3-kube-api-access-vj6n8" (OuterVolumeSpecName: "kube-api-access-vj6n8") pod "d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3" (UID: "d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3"). InnerVolumeSpecName "kube-api-access-vj6n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:50:33 crc kubenswrapper[4599]: I1012 07:50:33.249700 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f812212a-8c44-416a-8671-bceb361a6779-kube-api-access-6clp9" (OuterVolumeSpecName: "kube-api-access-6clp9") pod "f812212a-8c44-416a-8671-bceb361a6779" (UID: "f812212a-8c44-416a-8671-bceb361a6779"). InnerVolumeSpecName "kube-api-access-6clp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:50:33 crc kubenswrapper[4599]: I1012 07:50:33.345666 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6clp9\" (UniqueName: \"kubernetes.io/projected/f812212a-8c44-416a-8671-bceb361a6779-kube-api-access-6clp9\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:33 crc kubenswrapper[4599]: I1012 07:50:33.345703 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj6n8\" (UniqueName: \"kubernetes.io/projected/d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3-kube-api-access-vj6n8\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:33 crc kubenswrapper[4599]: I1012 07:50:33.760203 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d46a-account-create-gnb8g" event={"ID":"d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3","Type":"ContainerDied","Data":"fb7c2da8971af63bde778963785db985ecf597e624843e99e601031c3c665388"} Oct 12 07:50:33 crc kubenswrapper[4599]: I1012 07:50:33.760258 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb7c2da8971af63bde778963785db985ecf597e624843e99e601031c3c665388" Oct 12 07:50:33 crc kubenswrapper[4599]: I1012 07:50:33.760350 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d46a-account-create-gnb8g" Oct 12 07:50:33 crc kubenswrapper[4599]: I1012 07:50:33.768352 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18f19858-7b8d-4c40-afde-102c00500fe0","Type":"ContainerStarted","Data":"50d89208ffbfaf2b47cfe9fde6f024ae70473304133d3f34769e62780f33b003"} Oct 12 07:50:33 crc kubenswrapper[4599]: I1012 07:50:33.769840 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8bb3-account-create-cfxfp" event={"ID":"f812212a-8c44-416a-8671-bceb361a6779","Type":"ContainerDied","Data":"dd5b85bd9b4ddbce010c3600d78db5d94648e1d170a42a9b9f0c451c2f59c581"} Oct 12 07:50:33 crc kubenswrapper[4599]: I1012 07:50:33.769864 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd5b85bd9b4ddbce010c3600d78db5d94648e1d170a42a9b9f0c451c2f59c581" Oct 12 07:50:33 crc kubenswrapper[4599]: I1012 07:50:33.769900 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8bb3-account-create-cfxfp" Oct 12 07:50:34 crc kubenswrapper[4599]: I1012 07:50:34.795552 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-q7gzz"] Oct 12 07:50:34 crc kubenswrapper[4599]: E1012 07:50:34.796187 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c351afb9-ec01-409c-bde8-b466a94e4d9d" containerName="mariadb-account-create" Oct 12 07:50:34 crc kubenswrapper[4599]: I1012 07:50:34.796201 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="c351afb9-ec01-409c-bde8-b466a94e4d9d" containerName="mariadb-account-create" Oct 12 07:50:34 crc kubenswrapper[4599]: E1012 07:50:34.796227 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f812212a-8c44-416a-8671-bceb361a6779" containerName="mariadb-account-create" Oct 12 07:50:34 crc kubenswrapper[4599]: I1012 07:50:34.796235 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="f812212a-8c44-416a-8671-bceb361a6779" containerName="mariadb-account-create" Oct 12 07:50:34 crc kubenswrapper[4599]: E1012 07:50:34.796251 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3" containerName="mariadb-account-create" Oct 12 07:50:34 crc kubenswrapper[4599]: I1012 07:50:34.796257 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3" containerName="mariadb-account-create" Oct 12 07:50:34 crc kubenswrapper[4599]: I1012 07:50:34.800179 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3" containerName="mariadb-account-create" Oct 12 07:50:34 crc kubenswrapper[4599]: I1012 07:50:34.800209 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="c351afb9-ec01-409c-bde8-b466a94e4d9d" containerName="mariadb-account-create" Oct 12 07:50:34 crc kubenswrapper[4599]: I1012 07:50:34.800224 4599 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f812212a-8c44-416a-8671-bceb361a6779" containerName="mariadb-account-create" Oct 12 07:50:34 crc kubenswrapper[4599]: I1012 07:50:34.808718 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18f19858-7b8d-4c40-afde-102c00500fe0","Type":"ContainerStarted","Data":"aaab87415dfc693913a370b19728746de29f16583adf69abec100847729b553b"} Oct 12 07:50:34 crc kubenswrapper[4599]: I1012 07:50:34.808855 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-q7gzz" Oct 12 07:50:34 crc kubenswrapper[4599]: I1012 07:50:34.808946 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-q7gzz"] Oct 12 07:50:34 crc kubenswrapper[4599]: I1012 07:50:34.810573 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 12 07:50:34 crc kubenswrapper[4599]: I1012 07:50:34.810835 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-fngv7" Oct 12 07:50:34 crc kubenswrapper[4599]: I1012 07:50:34.811591 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 12 07:50:34 crc kubenswrapper[4599]: I1012 07:50:34.981908 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3c31956-8a05-421b-86cf-5e04f27a0ad1-config-data\") pod \"nova-cell0-conductor-db-sync-q7gzz\" (UID: \"c3c31956-8a05-421b-86cf-5e04f27a0ad1\") " pod="openstack/nova-cell0-conductor-db-sync-q7gzz" Oct 12 07:50:34 crc kubenswrapper[4599]: I1012 07:50:34.981992 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c31956-8a05-421b-86cf-5e04f27a0ad1-combined-ca-bundle\") pod 
\"nova-cell0-conductor-db-sync-q7gzz\" (UID: \"c3c31956-8a05-421b-86cf-5e04f27a0ad1\") " pod="openstack/nova-cell0-conductor-db-sync-q7gzz" Oct 12 07:50:34 crc kubenswrapper[4599]: I1012 07:50:34.982029 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcg4g\" (UniqueName: \"kubernetes.io/projected/c3c31956-8a05-421b-86cf-5e04f27a0ad1-kube-api-access-qcg4g\") pod \"nova-cell0-conductor-db-sync-q7gzz\" (UID: \"c3c31956-8a05-421b-86cf-5e04f27a0ad1\") " pod="openstack/nova-cell0-conductor-db-sync-q7gzz" Oct 12 07:50:34 crc kubenswrapper[4599]: I1012 07:50:34.982380 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3c31956-8a05-421b-86cf-5e04f27a0ad1-scripts\") pod \"nova-cell0-conductor-db-sync-q7gzz\" (UID: \"c3c31956-8a05-421b-86cf-5e04f27a0ad1\") " pod="openstack/nova-cell0-conductor-db-sync-q7gzz" Oct 12 07:50:35 crc kubenswrapper[4599]: I1012 07:50:35.084213 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c31956-8a05-421b-86cf-5e04f27a0ad1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-q7gzz\" (UID: \"c3c31956-8a05-421b-86cf-5e04f27a0ad1\") " pod="openstack/nova-cell0-conductor-db-sync-q7gzz" Oct 12 07:50:35 crc kubenswrapper[4599]: I1012 07:50:35.084256 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcg4g\" (UniqueName: \"kubernetes.io/projected/c3c31956-8a05-421b-86cf-5e04f27a0ad1-kube-api-access-qcg4g\") pod \"nova-cell0-conductor-db-sync-q7gzz\" (UID: \"c3c31956-8a05-421b-86cf-5e04f27a0ad1\") " pod="openstack/nova-cell0-conductor-db-sync-q7gzz" Oct 12 07:50:35 crc kubenswrapper[4599]: I1012 07:50:35.084327 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c3c31956-8a05-421b-86cf-5e04f27a0ad1-scripts\") pod \"nova-cell0-conductor-db-sync-q7gzz\" (UID: \"c3c31956-8a05-421b-86cf-5e04f27a0ad1\") " pod="openstack/nova-cell0-conductor-db-sync-q7gzz" Oct 12 07:50:35 crc kubenswrapper[4599]: I1012 07:50:35.084404 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3c31956-8a05-421b-86cf-5e04f27a0ad1-config-data\") pod \"nova-cell0-conductor-db-sync-q7gzz\" (UID: \"c3c31956-8a05-421b-86cf-5e04f27a0ad1\") " pod="openstack/nova-cell0-conductor-db-sync-q7gzz" Oct 12 07:50:35 crc kubenswrapper[4599]: I1012 07:50:35.091292 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3c31956-8a05-421b-86cf-5e04f27a0ad1-config-data\") pod \"nova-cell0-conductor-db-sync-q7gzz\" (UID: \"c3c31956-8a05-421b-86cf-5e04f27a0ad1\") " pod="openstack/nova-cell0-conductor-db-sync-q7gzz" Oct 12 07:50:35 crc kubenswrapper[4599]: I1012 07:50:35.092002 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3c31956-8a05-421b-86cf-5e04f27a0ad1-scripts\") pod \"nova-cell0-conductor-db-sync-q7gzz\" (UID: \"c3c31956-8a05-421b-86cf-5e04f27a0ad1\") " pod="openstack/nova-cell0-conductor-db-sync-q7gzz" Oct 12 07:50:35 crc kubenswrapper[4599]: I1012 07:50:35.097440 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c31956-8a05-421b-86cf-5e04f27a0ad1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-q7gzz\" (UID: \"c3c31956-8a05-421b-86cf-5e04f27a0ad1\") " pod="openstack/nova-cell0-conductor-db-sync-q7gzz" Oct 12 07:50:35 crc kubenswrapper[4599]: I1012 07:50:35.105596 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcg4g\" (UniqueName: 
\"kubernetes.io/projected/c3c31956-8a05-421b-86cf-5e04f27a0ad1-kube-api-access-qcg4g\") pod \"nova-cell0-conductor-db-sync-q7gzz\" (UID: \"c3c31956-8a05-421b-86cf-5e04f27a0ad1\") " pod="openstack/nova-cell0-conductor-db-sync-q7gzz" Oct 12 07:50:35 crc kubenswrapper[4599]: I1012 07:50:35.126620 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-q7gzz" Oct 12 07:50:35 crc kubenswrapper[4599]: I1012 07:50:35.561875 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-q7gzz"] Oct 12 07:50:35 crc kubenswrapper[4599]: W1012 07:50:35.570282 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3c31956_8a05_421b_86cf_5e04f27a0ad1.slice/crio-58c4736288aaa63f681dce3e08270f84b5fa4a85fc26e2ef7315665275fdff66 WatchSource:0}: Error finding container 58c4736288aaa63f681dce3e08270f84b5fa4a85fc26e2ef7315665275fdff66: Status 404 returned error can't find the container with id 58c4736288aaa63f681dce3e08270f84b5fa4a85fc26e2ef7315665275fdff66 Oct 12 07:50:35 crc kubenswrapper[4599]: I1012 07:50:35.809425 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18f19858-7b8d-4c40-afde-102c00500fe0","Type":"ContainerStarted","Data":"498ce5d4ca86aa9fffe991724399ac67b69637f733112dad41b0786a3f28c57e"} Oct 12 07:50:35 crc kubenswrapper[4599]: I1012 07:50:35.812049 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 12 07:50:35 crc kubenswrapper[4599]: I1012 07:50:35.816328 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-q7gzz" event={"ID":"c3c31956-8a05-421b-86cf-5e04f27a0ad1","Type":"ContainerStarted","Data":"58c4736288aaa63f681dce3e08270f84b5fa4a85fc26e2ef7315665275fdff66"} Oct 12 07:50:35 crc kubenswrapper[4599]: I1012 07:50:35.835159 4599 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.895581152 podStartE2EDuration="5.835129664s" podCreationTimestamp="2025-10-12 07:50:30 +0000 UTC" firstStartedPulling="2025-10-12 07:50:31.50318201 +0000 UTC m=+928.292377513" lastFinishedPulling="2025-10-12 07:50:35.442730523 +0000 UTC m=+932.231926025" observedRunningTime="2025-10-12 07:50:35.831664233 +0000 UTC m=+932.620859735" watchObservedRunningTime="2025-10-12 07:50:35.835129664 +0000 UTC m=+932.624325165" Oct 12 07:50:44 crc kubenswrapper[4599]: I1012 07:50:44.920458 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-q7gzz" event={"ID":"c3c31956-8a05-421b-86cf-5e04f27a0ad1","Type":"ContainerStarted","Data":"fa1c6ad6118afca12345e075d1eee8718b7d8386aa4167c93749b5827db46642"} Oct 12 07:50:44 crc kubenswrapper[4599]: I1012 07:50:44.944394 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-q7gzz" podStartSLOduration=2.166347761 podStartE2EDuration="10.944378599s" podCreationTimestamp="2025-10-12 07:50:34 +0000 UTC" firstStartedPulling="2025-10-12 07:50:35.573294498 +0000 UTC m=+932.362490000" lastFinishedPulling="2025-10-12 07:50:44.351325335 +0000 UTC m=+941.140520838" observedRunningTime="2025-10-12 07:50:44.941200151 +0000 UTC m=+941.730395653" watchObservedRunningTime="2025-10-12 07:50:44.944378599 +0000 UTC m=+941.733574101" Oct 12 07:50:50 crc kubenswrapper[4599]: I1012 07:50:50.990284 4599 generic.go:334] "Generic (PLEG): container finished" podID="c3c31956-8a05-421b-86cf-5e04f27a0ad1" containerID="fa1c6ad6118afca12345e075d1eee8718b7d8386aa4167c93749b5827db46642" exitCode=0 Oct 12 07:50:50 crc kubenswrapper[4599]: I1012 07:50:50.990385 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-q7gzz" 
event={"ID":"c3c31956-8a05-421b-86cf-5e04f27a0ad1","Type":"ContainerDied","Data":"fa1c6ad6118afca12345e075d1eee8718b7d8386aa4167c93749b5827db46642"} Oct 12 07:50:52 crc kubenswrapper[4599]: I1012 07:50:52.295457 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-q7gzz" Oct 12 07:50:52 crc kubenswrapper[4599]: I1012 07:50:52.474326 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3c31956-8a05-421b-86cf-5e04f27a0ad1-scripts\") pod \"c3c31956-8a05-421b-86cf-5e04f27a0ad1\" (UID: \"c3c31956-8a05-421b-86cf-5e04f27a0ad1\") " Oct 12 07:50:52 crc kubenswrapper[4599]: I1012 07:50:52.474400 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c31956-8a05-421b-86cf-5e04f27a0ad1-combined-ca-bundle\") pod \"c3c31956-8a05-421b-86cf-5e04f27a0ad1\" (UID: \"c3c31956-8a05-421b-86cf-5e04f27a0ad1\") " Oct 12 07:50:52 crc kubenswrapper[4599]: I1012 07:50:52.474435 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3c31956-8a05-421b-86cf-5e04f27a0ad1-config-data\") pod \"c3c31956-8a05-421b-86cf-5e04f27a0ad1\" (UID: \"c3c31956-8a05-421b-86cf-5e04f27a0ad1\") " Oct 12 07:50:52 crc kubenswrapper[4599]: I1012 07:50:52.474514 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcg4g\" (UniqueName: \"kubernetes.io/projected/c3c31956-8a05-421b-86cf-5e04f27a0ad1-kube-api-access-qcg4g\") pod \"c3c31956-8a05-421b-86cf-5e04f27a0ad1\" (UID: \"c3c31956-8a05-421b-86cf-5e04f27a0ad1\") " Oct 12 07:50:52 crc kubenswrapper[4599]: I1012 07:50:52.482107 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3c31956-8a05-421b-86cf-5e04f27a0ad1-scripts" (OuterVolumeSpecName: "scripts") pod 
"c3c31956-8a05-421b-86cf-5e04f27a0ad1" (UID: "c3c31956-8a05-421b-86cf-5e04f27a0ad1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:52 crc kubenswrapper[4599]: I1012 07:50:52.482366 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3c31956-8a05-421b-86cf-5e04f27a0ad1-kube-api-access-qcg4g" (OuterVolumeSpecName: "kube-api-access-qcg4g") pod "c3c31956-8a05-421b-86cf-5e04f27a0ad1" (UID: "c3c31956-8a05-421b-86cf-5e04f27a0ad1"). InnerVolumeSpecName "kube-api-access-qcg4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:50:52 crc kubenswrapper[4599]: I1012 07:50:52.502776 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3c31956-8a05-421b-86cf-5e04f27a0ad1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3c31956-8a05-421b-86cf-5e04f27a0ad1" (UID: "c3c31956-8a05-421b-86cf-5e04f27a0ad1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:52 crc kubenswrapper[4599]: I1012 07:50:52.503325 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3c31956-8a05-421b-86cf-5e04f27a0ad1-config-data" (OuterVolumeSpecName: "config-data") pod "c3c31956-8a05-421b-86cf-5e04f27a0ad1" (UID: "c3c31956-8a05-421b-86cf-5e04f27a0ad1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:50:52 crc kubenswrapper[4599]: I1012 07:50:52.578241 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3c31956-8a05-421b-86cf-5e04f27a0ad1-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:52 crc kubenswrapper[4599]: I1012 07:50:52.578582 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c31956-8a05-421b-86cf-5e04f27a0ad1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:52 crc kubenswrapper[4599]: I1012 07:50:52.578598 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3c31956-8a05-421b-86cf-5e04f27a0ad1-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:52 crc kubenswrapper[4599]: I1012 07:50:52.578609 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcg4g\" (UniqueName: \"kubernetes.io/projected/c3c31956-8a05-421b-86cf-5e04f27a0ad1-kube-api-access-qcg4g\") on node \"crc\" DevicePath \"\"" Oct 12 07:50:53 crc kubenswrapper[4599]: I1012 07:50:53.014897 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-q7gzz" event={"ID":"c3c31956-8a05-421b-86cf-5e04f27a0ad1","Type":"ContainerDied","Data":"58c4736288aaa63f681dce3e08270f84b5fa4a85fc26e2ef7315665275fdff66"} Oct 12 07:50:53 crc kubenswrapper[4599]: I1012 07:50:53.014953 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58c4736288aaa63f681dce3e08270f84b5fa4a85fc26e2ef7315665275fdff66" Oct 12 07:50:53 crc kubenswrapper[4599]: I1012 07:50:53.014987 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-q7gzz" Oct 12 07:50:53 crc kubenswrapper[4599]: I1012 07:50:53.118893 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 12 07:50:53 crc kubenswrapper[4599]: E1012 07:50:53.119364 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3c31956-8a05-421b-86cf-5e04f27a0ad1" containerName="nova-cell0-conductor-db-sync" Oct 12 07:50:53 crc kubenswrapper[4599]: I1012 07:50:53.119384 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3c31956-8a05-421b-86cf-5e04f27a0ad1" containerName="nova-cell0-conductor-db-sync" Oct 12 07:50:53 crc kubenswrapper[4599]: I1012 07:50:53.119600 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3c31956-8a05-421b-86cf-5e04f27a0ad1" containerName="nova-cell0-conductor-db-sync" Oct 12 07:50:53 crc kubenswrapper[4599]: I1012 07:50:53.120222 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 12 07:50:53 crc kubenswrapper[4599]: I1012 07:50:53.122645 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-fngv7" Oct 12 07:50:53 crc kubenswrapper[4599]: I1012 07:50:53.123713 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 12 07:50:53 crc kubenswrapper[4599]: I1012 07:50:53.134382 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 12 07:50:53 crc kubenswrapper[4599]: I1012 07:50:53.192023 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwbd8\" (UniqueName: \"kubernetes.io/projected/65c5adc9-0a5f-4631-bee5-c87a70c0d0a2-kube-api-access-kwbd8\") pod \"nova-cell0-conductor-0\" (UID: \"65c5adc9-0a5f-4631-bee5-c87a70c0d0a2\") " pod="openstack/nova-cell0-conductor-0" Oct 12 07:50:53 crc 
kubenswrapper[4599]: I1012 07:50:53.192074 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65c5adc9-0a5f-4631-bee5-c87a70c0d0a2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"65c5adc9-0a5f-4631-bee5-c87a70c0d0a2\") " pod="openstack/nova-cell0-conductor-0" Oct 12 07:50:53 crc kubenswrapper[4599]: I1012 07:50:53.192500 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c5adc9-0a5f-4631-bee5-c87a70c0d0a2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"65c5adc9-0a5f-4631-bee5-c87a70c0d0a2\") " pod="openstack/nova-cell0-conductor-0" Oct 12 07:50:53 crc kubenswrapper[4599]: I1012 07:50:53.294769 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65c5adc9-0a5f-4631-bee5-c87a70c0d0a2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"65c5adc9-0a5f-4631-bee5-c87a70c0d0a2\") " pod="openstack/nova-cell0-conductor-0" Oct 12 07:50:53 crc kubenswrapper[4599]: I1012 07:50:53.294959 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c5adc9-0a5f-4631-bee5-c87a70c0d0a2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"65c5adc9-0a5f-4631-bee5-c87a70c0d0a2\") " pod="openstack/nova-cell0-conductor-0" Oct 12 07:50:53 crc kubenswrapper[4599]: I1012 07:50:53.295027 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwbd8\" (UniqueName: \"kubernetes.io/projected/65c5adc9-0a5f-4631-bee5-c87a70c0d0a2-kube-api-access-kwbd8\") pod \"nova-cell0-conductor-0\" (UID: \"65c5adc9-0a5f-4631-bee5-c87a70c0d0a2\") " pod="openstack/nova-cell0-conductor-0" Oct 12 07:50:53 crc kubenswrapper[4599]: I1012 07:50:53.300877 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65c5adc9-0a5f-4631-bee5-c87a70c0d0a2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"65c5adc9-0a5f-4631-bee5-c87a70c0d0a2\") " pod="openstack/nova-cell0-conductor-0" Oct 12 07:50:53 crc kubenswrapper[4599]: I1012 07:50:53.300964 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c5adc9-0a5f-4631-bee5-c87a70c0d0a2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"65c5adc9-0a5f-4631-bee5-c87a70c0d0a2\") " pod="openstack/nova-cell0-conductor-0" Oct 12 07:50:53 crc kubenswrapper[4599]: I1012 07:50:53.310276 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwbd8\" (UniqueName: \"kubernetes.io/projected/65c5adc9-0a5f-4631-bee5-c87a70c0d0a2-kube-api-access-kwbd8\") pod \"nova-cell0-conductor-0\" (UID: \"65c5adc9-0a5f-4631-bee5-c87a70c0d0a2\") " pod="openstack/nova-cell0-conductor-0" Oct 12 07:50:53 crc kubenswrapper[4599]: I1012 07:50:53.433983 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 12 07:50:53 crc kubenswrapper[4599]: I1012 07:50:53.908029 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 12 07:50:54 crc kubenswrapper[4599]: I1012 07:50:54.027220 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"65c5adc9-0a5f-4631-bee5-c87a70c0d0a2","Type":"ContainerStarted","Data":"0467166a013b21ea2f3f58fd9b129f5726eb1dd167bf98c4ba744296148f4ecb"} Oct 12 07:50:55 crc kubenswrapper[4599]: I1012 07:50:55.039437 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"65c5adc9-0a5f-4631-bee5-c87a70c0d0a2","Type":"ContainerStarted","Data":"0917c5a582389ff84657e3a0a4f550fdc6f934c9f8e8e86c1b0985dcea4221df"} Oct 12 07:50:55 crc kubenswrapper[4599]: I1012 07:50:55.042960 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 12 07:50:55 crc kubenswrapper[4599]: I1012 07:50:55.063829 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.063812756 podStartE2EDuration="2.063812756s" podCreationTimestamp="2025-10-12 07:50:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:50:55.059480721 +0000 UTC m=+951.848676213" watchObservedRunningTime="2025-10-12 07:50:55.063812756 +0000 UTC m=+951.853008257" Oct 12 07:51:01 crc kubenswrapper[4599]: I1012 07:51:01.077836 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 12 07:51:03 crc kubenswrapper[4599]: I1012 07:51:03.468389 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 12 07:51:03 crc kubenswrapper[4599]: I1012 07:51:03.960464 4599 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-tszt8"] Oct 12 07:51:03 crc kubenswrapper[4599]: I1012 07:51:03.961675 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tszt8" Oct 12 07:51:03 crc kubenswrapper[4599]: I1012 07:51:03.965532 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 12 07:51:03 crc kubenswrapper[4599]: I1012 07:51:03.973980 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 12 07:51:03 crc kubenswrapper[4599]: I1012 07:51:03.974247 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-tszt8"] Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.028221 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcrsf\" (UniqueName: \"kubernetes.io/projected/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-kube-api-access-fcrsf\") pod \"nova-cell0-cell-mapping-tszt8\" (UID: \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\") " pod="openstack/nova-cell0-cell-mapping-tszt8" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.028516 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-config-data\") pod \"nova-cell0-cell-mapping-tszt8\" (UID: \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\") " pod="openstack/nova-cell0-cell-mapping-tszt8" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.028642 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tszt8\" (UID: \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\") " 
pod="openstack/nova-cell0-cell-mapping-tszt8" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.028828 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-scripts\") pod \"nova-cell0-cell-mapping-tszt8\" (UID: \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\") " pod="openstack/nova-cell0-cell-mapping-tszt8" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.118903 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.122962 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.126970 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.130375 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcrsf\" (UniqueName: \"kubernetes.io/projected/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-kube-api-access-fcrsf\") pod \"nova-cell0-cell-mapping-tszt8\" (UID: \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\") " pod="openstack/nova-cell0-cell-mapping-tszt8" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.130442 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-config-data\") pod \"nova-cell0-cell-mapping-tszt8\" (UID: \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\") " pod="openstack/nova-cell0-cell-mapping-tszt8" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.130491 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tszt8\" 
(UID: \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\") " pod="openstack/nova-cell0-cell-mapping-tszt8" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.130598 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-scripts\") pod \"nova-cell0-cell-mapping-tszt8\" (UID: \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\") " pod="openstack/nova-cell0-cell-mapping-tszt8" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.137777 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tszt8\" (UID: \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\") " pod="openstack/nova-cell0-cell-mapping-tszt8" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.137865 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-config-data\") pod \"nova-cell0-cell-mapping-tszt8\" (UID: \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\") " pod="openstack/nova-cell0-cell-mapping-tszt8" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.148388 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.149723 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.154750 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.154882 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcrsf\" (UniqueName: \"kubernetes.io/projected/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-kube-api-access-fcrsf\") pod \"nova-cell0-cell-mapping-tszt8\" (UID: \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\") " pod="openstack/nova-cell0-cell-mapping-tszt8" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.155287 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-scripts\") pod \"nova-cell0-cell-mapping-tszt8\" (UID: \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\") " pod="openstack/nova-cell0-cell-mapping-tszt8" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.169527 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.176167 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.232171 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04538b3c-68d8-43fa-adc3-d319f98bf0d9-config-data\") pod \"nova-scheduler-0\" (UID: \"04538b3c-68d8-43fa-adc3-d319f98bf0d9\") " pod="openstack/nova-scheduler-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.232233 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2c68\" (UniqueName: \"kubernetes.io/projected/04538b3c-68d8-43fa-adc3-d319f98bf0d9-kube-api-access-x2c68\") pod \"nova-scheduler-0\" (UID: 
\"04538b3c-68d8-43fa-adc3-d319f98bf0d9\") " pod="openstack/nova-scheduler-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.232617 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04538b3c-68d8-43fa-adc3-d319f98bf0d9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04538b3c-68d8-43fa-adc3-d319f98bf0d9\") " pod="openstack/nova-scheduler-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.232773 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cflm\" (UniqueName: \"kubernetes.io/projected/3b8ae9f7-4558-4445-8976-db978922edde-kube-api-access-5cflm\") pod \"nova-api-0\" (UID: \"3b8ae9f7-4558-4445-8976-db978922edde\") " pod="openstack/nova-api-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.232848 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b8ae9f7-4558-4445-8976-db978922edde-logs\") pod \"nova-api-0\" (UID: \"3b8ae9f7-4558-4445-8976-db978922edde\") " pod="openstack/nova-api-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.232893 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8ae9f7-4558-4445-8976-db978922edde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3b8ae9f7-4558-4445-8976-db978922edde\") " pod="openstack/nova-api-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.232973 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8ae9f7-4558-4445-8976-db978922edde-config-data\") pod \"nova-api-0\" (UID: \"3b8ae9f7-4558-4445-8976-db978922edde\") " pod="openstack/nova-api-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 
07:51:04.278508 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tszt8" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.300860 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.302263 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.308571 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.322393 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.334919 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04538b3c-68d8-43fa-adc3-d319f98bf0d9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04538b3c-68d8-43fa-adc3-d319f98bf0d9\") " pod="openstack/nova-scheduler-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.334965 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c45zb\" (UniqueName: \"kubernetes.io/projected/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-kube-api-access-c45zb\") pod \"nova-metadata-0\" (UID: \"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51\") " pod="openstack/nova-metadata-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.335022 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cflm\" (UniqueName: \"kubernetes.io/projected/3b8ae9f7-4558-4445-8976-db978922edde-kube-api-access-5cflm\") pod \"nova-api-0\" (UID: \"3b8ae9f7-4558-4445-8976-db978922edde\") " pod="openstack/nova-api-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.335045 4599 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-config-data\") pod \"nova-metadata-0\" (UID: \"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51\") " pod="openstack/nova-metadata-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.335069 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b8ae9f7-4558-4445-8976-db978922edde-logs\") pod \"nova-api-0\" (UID: \"3b8ae9f7-4558-4445-8976-db978922edde\") " pod="openstack/nova-api-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.335087 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8ae9f7-4558-4445-8976-db978922edde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3b8ae9f7-4558-4445-8976-db978922edde\") " pod="openstack/nova-api-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.335118 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8ae9f7-4558-4445-8976-db978922edde-config-data\") pod \"nova-api-0\" (UID: \"3b8ae9f7-4558-4445-8976-db978922edde\") " pod="openstack/nova-api-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.335151 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04538b3c-68d8-43fa-adc3-d319f98bf0d9-config-data\") pod \"nova-scheduler-0\" (UID: \"04538b3c-68d8-43fa-adc3-d319f98bf0d9\") " pod="openstack/nova-scheduler-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.335167 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-logs\") pod \"nova-metadata-0\" (UID: 
\"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51\") " pod="openstack/nova-metadata-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.335187 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2c68\" (UniqueName: \"kubernetes.io/projected/04538b3c-68d8-43fa-adc3-d319f98bf0d9-kube-api-access-x2c68\") pod \"nova-scheduler-0\" (UID: \"04538b3c-68d8-43fa-adc3-d319f98bf0d9\") " pod="openstack/nova-scheduler-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.335236 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51\") " pod="openstack/nova-metadata-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.341110 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04538b3c-68d8-43fa-adc3-d319f98bf0d9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04538b3c-68d8-43fa-adc3-d319f98bf0d9\") " pod="openstack/nova-scheduler-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.341794 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b8ae9f7-4558-4445-8976-db978922edde-logs\") pod \"nova-api-0\" (UID: \"3b8ae9f7-4558-4445-8976-db978922edde\") " pod="openstack/nova-api-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.350192 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.351433 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.355956 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.356143 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8ae9f7-4558-4445-8976-db978922edde-config-data\") pod \"nova-api-0\" (UID: \"3b8ae9f7-4558-4445-8976-db978922edde\") " pod="openstack/nova-api-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.359888 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04538b3c-68d8-43fa-adc3-d319f98bf0d9-config-data\") pod \"nova-scheduler-0\" (UID: \"04538b3c-68d8-43fa-adc3-d319f98bf0d9\") " pod="openstack/nova-scheduler-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.360562 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8ae9f7-4558-4445-8976-db978922edde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3b8ae9f7-4558-4445-8976-db978922edde\") " pod="openstack/nova-api-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.368776 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cflm\" (UniqueName: \"kubernetes.io/projected/3b8ae9f7-4558-4445-8976-db978922edde-kube-api-access-5cflm\") pod \"nova-api-0\" (UID: \"3b8ae9f7-4558-4445-8976-db978922edde\") " pod="openstack/nova-api-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.373872 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2c68\" (UniqueName: \"kubernetes.io/projected/04538b3c-68d8-43fa-adc3-d319f98bf0d9-kube-api-access-x2c68\") pod \"nova-scheduler-0\" (UID: \"04538b3c-68d8-43fa-adc3-d319f98bf0d9\") " 
pod="openstack/nova-scheduler-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.423742 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.442035 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c45zb\" (UniqueName: \"kubernetes.io/projected/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-kube-api-access-c45zb\") pod \"nova-metadata-0\" (UID: \"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51\") " pod="openstack/nova-metadata-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.442493 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-config-data\") pod \"nova-metadata-0\" (UID: \"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51\") " pod="openstack/nova-metadata-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.442577 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-logs\") pod \"nova-metadata-0\" (UID: \"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51\") " pod="openstack/nova-metadata-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.442622 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51\") " pod="openstack/nova-metadata-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.444268 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-logs\") pod \"nova-metadata-0\" (UID: \"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51\") " pod="openstack/nova-metadata-0" Oct 12 07:51:04 crc 
kubenswrapper[4599]: I1012 07:51:04.449526 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51\") " pod="openstack/nova-metadata-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.451952 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-config-data\") pod \"nova-metadata-0\" (UID: \"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51\") " pod="openstack/nova-metadata-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.493968 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c45zb\" (UniqueName: \"kubernetes.io/projected/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-kube-api-access-c45zb\") pod \"nova-metadata-0\" (UID: \"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51\") " pod="openstack/nova-metadata-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.524687 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.537965 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.540412 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-94c48d4d7-z6cgn"] Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.542142 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.544075 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e47371c-f8d3-4eef-9a33-f4f6f7169dee-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e47371c-f8d3-4eef-9a33-f4f6f7169dee\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.544163 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e47371c-f8d3-4eef-9a33-f4f6f7169dee-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e47371c-f8d3-4eef-9a33-f4f6f7169dee\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.544326 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j89bz\" (UniqueName: \"kubernetes.io/projected/2e47371c-f8d3-4eef-9a33-f4f6f7169dee-kube-api-access-j89bz\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e47371c-f8d3-4eef-9a33-f4f6f7169dee\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.581933 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94c48d4d7-z6cgn"] Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.662872 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-dns-swift-storage-0\") pod \"dnsmasq-dns-94c48d4d7-z6cgn\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.662939 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-j89bz\" (UniqueName: \"kubernetes.io/projected/2e47371c-f8d3-4eef-9a33-f4f6f7169dee-kube-api-access-j89bz\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e47371c-f8d3-4eef-9a33-f4f6f7169dee\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.663056 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-ovsdbserver-sb\") pod \"dnsmasq-dns-94c48d4d7-z6cgn\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.663078 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-dns-svc\") pod \"dnsmasq-dns-94c48d4d7-z6cgn\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.663270 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e47371c-f8d3-4eef-9a33-f4f6f7169dee-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e47371c-f8d3-4eef-9a33-f4f6f7169dee\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.663368 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-ovsdbserver-nb\") pod \"dnsmasq-dns-94c48d4d7-z6cgn\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.663431 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/2e47371c-f8d3-4eef-9a33-f4f6f7169dee-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e47371c-f8d3-4eef-9a33-f4f6f7169dee\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.663480 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-config\") pod \"dnsmasq-dns-94c48d4d7-z6cgn\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.663550 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j69hk\" (UniqueName: \"kubernetes.io/projected/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-kube-api-access-j69hk\") pod \"dnsmasq-dns-94c48d4d7-z6cgn\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.668256 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e47371c-f8d3-4eef-9a33-f4f6f7169dee-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e47371c-f8d3-4eef-9a33-f4f6f7169dee\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.676196 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e47371c-f8d3-4eef-9a33-f4f6f7169dee-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e47371c-f8d3-4eef-9a33-f4f6f7169dee\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.696611 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j89bz\" (UniqueName: 
\"kubernetes.io/projected/2e47371c-f8d3-4eef-9a33-f4f6f7169dee-kube-api-access-j89bz\") pod \"nova-cell1-novncproxy-0\" (UID: \"2e47371c-f8d3-4eef-9a33-f4f6f7169dee\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.749790 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.765101 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-ovsdbserver-nb\") pod \"dnsmasq-dns-94c48d4d7-z6cgn\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn"
Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.765160 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-config\") pod \"dnsmasq-dns-94c48d4d7-z6cgn\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn"
Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.765196 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j69hk\" (UniqueName: \"kubernetes.io/projected/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-kube-api-access-j69hk\") pod \"dnsmasq-dns-94c48d4d7-z6cgn\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn"
Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.765247 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-dns-swift-storage-0\") pod \"dnsmasq-dns-94c48d4d7-z6cgn\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn"
Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.765286 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-ovsdbserver-sb\") pod \"dnsmasq-dns-94c48d4d7-z6cgn\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn"
Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.765306 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-dns-svc\") pod \"dnsmasq-dns-94c48d4d7-z6cgn\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn"
Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.766053 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-dns-svc\") pod \"dnsmasq-dns-94c48d4d7-z6cgn\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn"
Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.767061 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-ovsdbserver-nb\") pod \"dnsmasq-dns-94c48d4d7-z6cgn\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn"
Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.768006 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-ovsdbserver-sb\") pod \"dnsmasq-dns-94c48d4d7-z6cgn\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn"
Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.768465 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-config\") pod \"dnsmasq-dns-94c48d4d7-z6cgn\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn"
Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.769549 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-dns-swift-storage-0\") pod \"dnsmasq-dns-94c48d4d7-z6cgn\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn"
Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.769752 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.782432 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j69hk\" (UniqueName: \"kubernetes.io/projected/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-kube-api-access-j69hk\") pod \"dnsmasq-dns-94c48d4d7-z6cgn\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn"
Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.904960 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn"
Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.987303 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 12 07:51:04 crc kubenswrapper[4599]: I1012 07:51:04.987594 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="0ca657cf-7ac2-4cb5-894b-c3a149ee4101" containerName="kube-state-metrics" containerID="cri-o://4c62790da42dd9b8491acfb336b6b41933eda934dab99f342dc2b627fcb3872a" gracePeriod=30
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.008963 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-tszt8"]
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.192123 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tszt8" event={"ID":"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8","Type":"ContainerStarted","Data":"23a42d6d88c6e5c7392e3d9b49f7690b56078cfd080d27d4e7e682d702e7c0c2"}
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.204487 4599 generic.go:334] "Generic (PLEG): container finished" podID="0ca657cf-7ac2-4cb5-894b-c3a149ee4101" containerID="4c62790da42dd9b8491acfb336b6b41933eda934dab99f342dc2b627fcb3872a" exitCode=2
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.204533 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0ca657cf-7ac2-4cb5-894b-c3a149ee4101","Type":"ContainerDied","Data":"4c62790da42dd9b8491acfb336b6b41933eda934dab99f342dc2b627fcb3872a"}
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.230598 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.410262 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-f7jhc"]
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.411535 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-f7jhc"
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.413809 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.414456 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.423618 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-f7jhc"]
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.460579 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.512356 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73cd69f3-194c-4484-a0cf-01815024d884-config-data\") pod \"nova-cell1-conductor-db-sync-f7jhc\" (UID: \"73cd69f3-194c-4484-a0cf-01815024d884\") " pod="openstack/nova-cell1-conductor-db-sync-f7jhc"
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.513092 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsb7c\" (UniqueName: \"kubernetes.io/projected/73cd69f3-194c-4484-a0cf-01815024d884-kube-api-access-lsb7c\") pod \"nova-cell1-conductor-db-sync-f7jhc\" (UID: \"73cd69f3-194c-4484-a0cf-01815024d884\") " pod="openstack/nova-cell1-conductor-db-sync-f7jhc"
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.513181 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73cd69f3-194c-4484-a0cf-01815024d884-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-f7jhc\" (UID: \"73cd69f3-194c-4484-a0cf-01815024d884\") " pod="openstack/nova-cell1-conductor-db-sync-f7jhc"
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.513317 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73cd69f3-194c-4484-a0cf-01815024d884-scripts\") pod \"nova-cell1-conductor-db-sync-f7jhc\" (UID: \"73cd69f3-194c-4484-a0cf-01815024d884\") " pod="openstack/nova-cell1-conductor-db-sync-f7jhc"
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.572852 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.615588 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73cd69f3-194c-4484-a0cf-01815024d884-config-data\") pod \"nova-cell1-conductor-db-sync-f7jhc\" (UID: \"73cd69f3-194c-4484-a0cf-01815024d884\") " pod="openstack/nova-cell1-conductor-db-sync-f7jhc"
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.615649 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsb7c\" (UniqueName: \"kubernetes.io/projected/73cd69f3-194c-4484-a0cf-01815024d884-kube-api-access-lsb7c\") pod \"nova-cell1-conductor-db-sync-f7jhc\" (UID: \"73cd69f3-194c-4484-a0cf-01815024d884\") " pod="openstack/nova-cell1-conductor-db-sync-f7jhc"
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.615670 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73cd69f3-194c-4484-a0cf-01815024d884-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-f7jhc\" (UID: \"73cd69f3-194c-4484-a0cf-01815024d884\") " pod="openstack/nova-cell1-conductor-db-sync-f7jhc"
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.615730 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73cd69f3-194c-4484-a0cf-01815024d884-scripts\") pod \"nova-cell1-conductor-db-sync-f7jhc\" (UID: \"73cd69f3-194c-4484-a0cf-01815024d884\") " pod="openstack/nova-cell1-conductor-db-sync-f7jhc"
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.623733 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73cd69f3-194c-4484-a0cf-01815024d884-scripts\") pod \"nova-cell1-conductor-db-sync-f7jhc\" (UID: \"73cd69f3-194c-4484-a0cf-01815024d884\") " pod="openstack/nova-cell1-conductor-db-sync-f7jhc"
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.623945 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73cd69f3-194c-4484-a0cf-01815024d884-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-f7jhc\" (UID: \"73cd69f3-194c-4484-a0cf-01815024d884\") " pod="openstack/nova-cell1-conductor-db-sync-f7jhc"
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.624230 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73cd69f3-194c-4484-a0cf-01815024d884-config-data\") pod \"nova-cell1-conductor-db-sync-f7jhc\" (UID: \"73cd69f3-194c-4484-a0cf-01815024d884\") " pod="openstack/nova-cell1-conductor-db-sync-f7jhc"
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.630809 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsb7c\" (UniqueName: \"kubernetes.io/projected/73cd69f3-194c-4484-a0cf-01815024d884-kube-api-access-lsb7c\") pod \"nova-cell1-conductor-db-sync-f7jhc\" (UID: \"73cd69f3-194c-4484-a0cf-01815024d884\") " pod="openstack/nova-cell1-conductor-db-sync-f7jhc"
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.645991 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.717535 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xvtt\" (UniqueName: \"kubernetes.io/projected/0ca657cf-7ac2-4cb5-894b-c3a149ee4101-kube-api-access-5xvtt\") pod \"0ca657cf-7ac2-4cb5-894b-c3a149ee4101\" (UID: \"0ca657cf-7ac2-4cb5-894b-c3a149ee4101\") "
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.720833 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ca657cf-7ac2-4cb5-894b-c3a149ee4101-kube-api-access-5xvtt" (OuterVolumeSpecName: "kube-api-access-5xvtt") pod "0ca657cf-7ac2-4cb5-894b-c3a149ee4101" (UID: "0ca657cf-7ac2-4cb5-894b-c3a149ee4101"). InnerVolumeSpecName "kube-api-access-5xvtt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.727251 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-f7jhc"
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.779614 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94c48d4d7-z6cgn"]
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.786926 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 12 07:51:05 crc kubenswrapper[4599]: I1012 07:51:05.820494 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xvtt\" (UniqueName: \"kubernetes.io/projected/0ca657cf-7ac2-4cb5-894b-c3a149ee4101-kube-api-access-5xvtt\") on node \"crc\" DevicePath \"\""
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.173695 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-f7jhc"]
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.215938 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b8ae9f7-4558-4445-8976-db978922edde","Type":"ContainerStarted","Data":"01b7b9b04198b80b2a036e9d8886f5bb775f5ca51d6dd289dd2148954dd65a08"}
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.218232 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tszt8" event={"ID":"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8","Type":"ContainerStarted","Data":"4d592c4300dd77d3ce4d85000060919b3b15a7d14a77df4f4fd43b2a04a20080"}
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.219859 4599 generic.go:334] "Generic (PLEG): container finished" podID="8361da66-d18b-4f7c-8c5e-6e1fc04b8f11" containerID="434e31289d95dba5b878fa38d669f637c9667e9b30ef07bc25abe9022084dccb" exitCode=0
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.219913 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn" event={"ID":"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11","Type":"ContainerDied","Data":"434e31289d95dba5b878fa38d669f637c9667e9b30ef07bc25abe9022084dccb"}
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.219940 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn" event={"ID":"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11","Type":"ContainerStarted","Data":"86ee33f29c0b4e525dc697531c0fd590867b86494f74dec87ac79c38f77490ab"}
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.224702 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51","Type":"ContainerStarted","Data":"d19a227617f6991796b6f345a88c0e0463bbff500e6fdc085b679d8fae9502cd"}
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.234674 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0ca657cf-7ac2-4cb5-894b-c3a149ee4101","Type":"ContainerDied","Data":"48ead8091005c34ed45cd6ad65b8725f721aa243504bbec10149ec01246e58f1"}
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.234718 4599 scope.go:117] "RemoveContainer" containerID="4c62790da42dd9b8491acfb336b6b41933eda934dab99f342dc2b627fcb3872a"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.234827 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.238577 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-tszt8" podStartSLOduration=3.238558128 podStartE2EDuration="3.238558128s" podCreationTimestamp="2025-10-12 07:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:51:06.230566667 +0000 UTC m=+963.019762169" watchObservedRunningTime="2025-10-12 07:51:06.238558128 +0000 UTC m=+963.027753630"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.242966 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04538b3c-68d8-43fa-adc3-d319f98bf0d9","Type":"ContainerStarted","Data":"4eda18998bd5accb28f0bbafa6a8b457511b836b15b0d252ddc4a14eaebc79cd"}
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.266035 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-f7jhc" event={"ID":"73cd69f3-194c-4484-a0cf-01815024d884","Type":"ContainerStarted","Data":"5c443e4278162edd441b0ad051b349ad231b1ac1bf4c26e249843a9df4858158"}
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.279487 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2e47371c-f8d3-4eef-9a33-f4f6f7169dee","Type":"ContainerStarted","Data":"df31d022720b454b4781a46c2f4b6d6755901b117df48c6dd5de8c4f97e43075"}
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.310803 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.329042 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.337627 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 12 07:51:06 crc kubenswrapper[4599]: E1012 07:51:06.338269 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ca657cf-7ac2-4cb5-894b-c3a149ee4101" containerName="kube-state-metrics"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.338285 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ca657cf-7ac2-4cb5-894b-c3a149ee4101" containerName="kube-state-metrics"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.338526 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ca657cf-7ac2-4cb5-894b-c3a149ee4101" containerName="kube-state-metrics"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.339373 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.341484 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.344268 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.361351 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.436450 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm5zx\" (UniqueName: \"kubernetes.io/projected/03a40b71-5a8f-42cd-97dd-e1b360a15b68-kube-api-access-cm5zx\") pod \"kube-state-metrics-0\" (UID: \"03a40b71-5a8f-42cd-97dd-e1b360a15b68\") " pod="openstack/kube-state-metrics-0"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.436542 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/03a40b71-5a8f-42cd-97dd-e1b360a15b68-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"03a40b71-5a8f-42cd-97dd-e1b360a15b68\") " pod="openstack/kube-state-metrics-0"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.436576 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/03a40b71-5a8f-42cd-97dd-e1b360a15b68-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"03a40b71-5a8f-42cd-97dd-e1b360a15b68\") " pod="openstack/kube-state-metrics-0"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.436797 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a40b71-5a8f-42cd-97dd-e1b360a15b68-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"03a40b71-5a8f-42cd-97dd-e1b360a15b68\") " pod="openstack/kube-state-metrics-0"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.543264 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm5zx\" (UniqueName: \"kubernetes.io/projected/03a40b71-5a8f-42cd-97dd-e1b360a15b68-kube-api-access-cm5zx\") pod \"kube-state-metrics-0\" (UID: \"03a40b71-5a8f-42cd-97dd-e1b360a15b68\") " pod="openstack/kube-state-metrics-0"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.546249 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/03a40b71-5a8f-42cd-97dd-e1b360a15b68-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"03a40b71-5a8f-42cd-97dd-e1b360a15b68\") " pod="openstack/kube-state-metrics-0"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.546294 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/03a40b71-5a8f-42cd-97dd-e1b360a15b68-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"03a40b71-5a8f-42cd-97dd-e1b360a15b68\") " pod="openstack/kube-state-metrics-0"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.546392 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a40b71-5a8f-42cd-97dd-e1b360a15b68-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"03a40b71-5a8f-42cd-97dd-e1b360a15b68\") " pod="openstack/kube-state-metrics-0"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.552042 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/03a40b71-5a8f-42cd-97dd-e1b360a15b68-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"03a40b71-5a8f-42cd-97dd-e1b360a15b68\") " pod="openstack/kube-state-metrics-0"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.552788 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/03a40b71-5a8f-42cd-97dd-e1b360a15b68-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"03a40b71-5a8f-42cd-97dd-e1b360a15b68\") " pod="openstack/kube-state-metrics-0"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.553313 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a40b71-5a8f-42cd-97dd-e1b360a15b68-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"03a40b71-5a8f-42cd-97dd-e1b360a15b68\") " pod="openstack/kube-state-metrics-0"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.560686 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm5zx\" (UniqueName: \"kubernetes.io/projected/03a40b71-5a8f-42cd-97dd-e1b360a15b68-kube-api-access-cm5zx\") pod \"kube-state-metrics-0\" (UID: \"03a40b71-5a8f-42cd-97dd-e1b360a15b68\") " pod="openstack/kube-state-metrics-0"
Oct 12 07:51:06 crc kubenswrapper[4599]: I1012 07:51:06.704680 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 12 07:51:07 crc kubenswrapper[4599]: I1012 07:51:07.292624 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-f7jhc" event={"ID":"73cd69f3-194c-4484-a0cf-01815024d884","Type":"ContainerStarted","Data":"9251b6efd4054f03bf78b785b242d671aede9a26bb2415e8b96330e3b4e6d0de"}
Oct 12 07:51:07 crc kubenswrapper[4599]: I1012 07:51:07.299263 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn" event={"ID":"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11","Type":"ContainerStarted","Data":"d90966a1acb1c1aa7d8e5c9d09d5294767275b776e8b1fb1a864173cd5629e23"}
Oct 12 07:51:07 crc kubenswrapper[4599]: I1012 07:51:07.299557 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn"
Oct 12 07:51:07 crc kubenswrapper[4599]: I1012 07:51:07.324413 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 12 07:51:07 crc kubenswrapper[4599]: I1012 07:51:07.335115 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-f7jhc" podStartSLOduration=2.335093534 podStartE2EDuration="2.335093534s" podCreationTimestamp="2025-10-12 07:51:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:51:07.31728794 +0000 UTC m=+964.106483442" watchObservedRunningTime="2025-10-12 07:51:07.335093534 +0000 UTC m=+964.124289036"
Oct 12 07:51:07 crc kubenswrapper[4599]: I1012 07:51:07.343127 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn" podStartSLOduration=3.343109191 podStartE2EDuration="3.343109191s" podCreationTimestamp="2025-10-12 07:51:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:51:07.338704659 +0000 UTC m=+964.127900161" watchObservedRunningTime="2025-10-12 07:51:07.343109191 +0000 UTC m=+964.132304693"
Oct 12 07:51:07 crc kubenswrapper[4599]: I1012 07:51:07.564406 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ca657cf-7ac2-4cb5-894b-c3a149ee4101" path="/var/lib/kubelet/pods/0ca657cf-7ac2-4cb5-894b-c3a149ee4101/volumes"
Oct 12 07:51:07 crc kubenswrapper[4599]: I1012 07:51:07.779481 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 12 07:51:07 crc kubenswrapper[4599]: I1012 07:51:07.779778 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18f19858-7b8d-4c40-afde-102c00500fe0" containerName="ceilometer-central-agent" containerID="cri-o://be351c4ab187da14c841501e1db3f02eca70d2ffcf8fd7925cbf3e97393380de" gracePeriod=30
Oct 12 07:51:07 crc kubenswrapper[4599]: I1012 07:51:07.779925 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18f19858-7b8d-4c40-afde-102c00500fe0" containerName="proxy-httpd" containerID="cri-o://498ce5d4ca86aa9fffe991724399ac67b69637f733112dad41b0786a3f28c57e" gracePeriod=30
Oct 12 07:51:07 crc kubenswrapper[4599]: I1012 07:51:07.779974 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18f19858-7b8d-4c40-afde-102c00500fe0" containerName="sg-core" containerID="cri-o://aaab87415dfc693913a370b19728746de29f16583adf69abec100847729b553b" gracePeriod=30
Oct 12 07:51:07 crc kubenswrapper[4599]: I1012 07:51:07.780008 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18f19858-7b8d-4c40-afde-102c00500fe0" containerName="ceilometer-notification-agent" containerID="cri-o://50d89208ffbfaf2b47cfe9fde6f024ae70473304133d3f34769e62780f33b003" gracePeriod=30
Oct 12 07:51:08 crc kubenswrapper[4599]: I1012 07:51:08.309049 4599 generic.go:334] "Generic (PLEG): container finished" podID="18f19858-7b8d-4c40-afde-102c00500fe0" containerID="498ce5d4ca86aa9fffe991724399ac67b69637f733112dad41b0786a3f28c57e" exitCode=0
Oct 12 07:51:08 crc kubenswrapper[4599]: I1012 07:51:08.309078 4599 generic.go:334] "Generic (PLEG): container finished" podID="18f19858-7b8d-4c40-afde-102c00500fe0" containerID="aaab87415dfc693913a370b19728746de29f16583adf69abec100847729b553b" exitCode=2
Oct 12 07:51:08 crc kubenswrapper[4599]: I1012 07:51:08.309086 4599 generic.go:334] "Generic (PLEG): container finished" podID="18f19858-7b8d-4c40-afde-102c00500fe0" containerID="50d89208ffbfaf2b47cfe9fde6f024ae70473304133d3f34769e62780f33b003" exitCode=0
Oct 12 07:51:08 crc kubenswrapper[4599]: I1012 07:51:08.309094 4599 generic.go:334] "Generic (PLEG): container finished" podID="18f19858-7b8d-4c40-afde-102c00500fe0" containerID="be351c4ab187da14c841501e1db3f02eca70d2ffcf8fd7925cbf3e97393380de" exitCode=0
Oct 12 07:51:08 crc kubenswrapper[4599]: I1012 07:51:08.309759 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18f19858-7b8d-4c40-afde-102c00500fe0","Type":"ContainerDied","Data":"498ce5d4ca86aa9fffe991724399ac67b69637f733112dad41b0786a3f28c57e"}
Oct 12 07:51:08 crc kubenswrapper[4599]: I1012 07:51:08.309799 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18f19858-7b8d-4c40-afde-102c00500fe0","Type":"ContainerDied","Data":"aaab87415dfc693913a370b19728746de29f16583adf69abec100847729b553b"}
Oct 12 07:51:08 crc kubenswrapper[4599]: I1012 07:51:08.309810 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18f19858-7b8d-4c40-afde-102c00500fe0","Type":"ContainerDied","Data":"50d89208ffbfaf2b47cfe9fde6f024ae70473304133d3f34769e62780f33b003"}
Oct 12 07:51:08 crc kubenswrapper[4599]: I1012 07:51:08.309818 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18f19858-7b8d-4c40-afde-102c00500fe0","Type":"ContainerDied","Data":"be351c4ab187da14c841501e1db3f02eca70d2ffcf8fd7925cbf3e97393380de"}
Oct 12 07:51:08 crc kubenswrapper[4599]: I1012 07:51:08.923457 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 12 07:51:08 crc kubenswrapper[4599]: I1012 07:51:08.935802 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.157063 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.204898 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87h6w\" (UniqueName: \"kubernetes.io/projected/18f19858-7b8d-4c40-afde-102c00500fe0-kube-api-access-87h6w\") pod \"18f19858-7b8d-4c40-afde-102c00500fe0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") "
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.205259 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-scripts\") pod \"18f19858-7b8d-4c40-afde-102c00500fe0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") "
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.205293 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18f19858-7b8d-4c40-afde-102c00500fe0-log-httpd\") pod \"18f19858-7b8d-4c40-afde-102c00500fe0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") "
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.205380 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-sg-core-conf-yaml\") pod \"18f19858-7b8d-4c40-afde-102c00500fe0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") "
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.205555 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-config-data\") pod \"18f19858-7b8d-4c40-afde-102c00500fe0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") "
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.205635 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18f19858-7b8d-4c40-afde-102c00500fe0-run-httpd\") pod \"18f19858-7b8d-4c40-afde-102c00500fe0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") "
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.205666 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-combined-ca-bundle\") pod \"18f19858-7b8d-4c40-afde-102c00500fe0\" (UID: \"18f19858-7b8d-4c40-afde-102c00500fe0\") "
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.205891 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18f19858-7b8d-4c40-afde-102c00500fe0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "18f19858-7b8d-4c40-afde-102c00500fe0" (UID: "18f19858-7b8d-4c40-afde-102c00500fe0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.206134 4599 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18f19858-7b8d-4c40-afde-102c00500fe0-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.206288 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18f19858-7b8d-4c40-afde-102c00500fe0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "18f19858-7b8d-4c40-afde-102c00500fe0" (UID: "18f19858-7b8d-4c40-afde-102c00500fe0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.214191 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-scripts" (OuterVolumeSpecName: "scripts") pod "18f19858-7b8d-4c40-afde-102c00500fe0" (UID: "18f19858-7b8d-4c40-afde-102c00500fe0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.214237 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18f19858-7b8d-4c40-afde-102c00500fe0-kube-api-access-87h6w" (OuterVolumeSpecName: "kube-api-access-87h6w") pod "18f19858-7b8d-4c40-afde-102c00500fe0" (UID: "18f19858-7b8d-4c40-afde-102c00500fe0"). InnerVolumeSpecName "kube-api-access-87h6w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.246666 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "18f19858-7b8d-4c40-afde-102c00500fe0" (UID: "18f19858-7b8d-4c40-afde-102c00500fe0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.300019 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18f19858-7b8d-4c40-afde-102c00500fe0" (UID: "18f19858-7b8d-4c40-afde-102c00500fe0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.307597 4599 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18f19858-7b8d-4c40-afde-102c00500fe0-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.307624 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.307636 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87h6w\" (UniqueName: \"kubernetes.io/projected/18f19858-7b8d-4c40-afde-102c00500fe0-kube-api-access-87h6w\") on node \"crc\" DevicePath \"\""
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.307644 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-scripts\") on node \"crc\" DevicePath \"\""
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.307653 4599 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.329434 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51","Type":"ContainerStarted","Data":"3a94e2f8a5f5cb8b8dd449c9db77eb5c5888f40fdd3fd11c92797faf920b6f0a"}
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.331768 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18f19858-7b8d-4c40-afde-102c00500fe0","Type":"ContainerDied","Data":"f237955579909e1c4bc6d182e14ba03f1b581fc1cd2c657361f3dae91c2afbab"}
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.331829 4599 scope.go:117] "RemoveContainer" containerID="498ce5d4ca86aa9fffe991724399ac67b69637f733112dad41b0786a3f28c57e"
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.332076 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.335714 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04538b3c-68d8-43fa-adc3-d319f98bf0d9","Type":"ContainerStarted","Data":"73f4c8b92bd06f1e1fdc40db556350adf07c9fcecf7d9a766f30e3b507692ed8"}
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.348692 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2e47371c-f8d3-4eef-9a33-f4f6f7169dee","Type":"ContainerStarted","Data":"f75f2f7840736ad38279f593d358c36d383a46619d646761943c18436cfc04f4"}
Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.351692 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.771174864 podStartE2EDuration="5.351677681s" podCreationTimestamp="2025-10-12 07:51:04 +0000 UTC" firstStartedPulling="2025-10-12 07:51:05.283553497 +0000 UTC m=+962.072748998" lastFinishedPulling="2025-10-12 07:51:08.864056313 +0000 UTC m=+965.653251815" observedRunningTime="2025-10-12 07:51:09.34873628 +0000 UTC m=+966.137931783"
watchObservedRunningTime="2025-10-12 07:51:09.351677681 +0000 UTC m=+966.140873184" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.348852 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="2e47371c-f8d3-4eef-9a33-f4f6f7169dee" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f75f2f7840736ad38279f593d358c36d383a46619d646761943c18436cfc04f4" gracePeriod=30 Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.371590 4599 scope.go:117] "RemoveContainer" containerID="aaab87415dfc693913a370b19728746de29f16583adf69abec100847729b553b" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.381673 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b8ae9f7-4558-4445-8976-db978922edde","Type":"ContainerStarted","Data":"8aa833c06a2a9e762622ad8c932f1593ea522e274c9387755ea12c13264466ad"} Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.384625 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-config-data" (OuterVolumeSpecName: "config-data") pod "18f19858-7b8d-4c40-afde-102c00500fe0" (UID: "18f19858-7b8d-4c40-afde-102c00500fe0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.386832 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"03a40b71-5a8f-42cd-97dd-e1b360a15b68","Type":"ContainerStarted","Data":"04032b6c50a43a5e83c9256c25d87b97e08329ff9b1c7bd7111ae9616cacf462"} Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.386878 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"03a40b71-5a8f-42cd-97dd-e1b360a15b68","Type":"ContainerStarted","Data":"c48f4a4318c84d447fd3f730124816115c82dc5863275408073c4576381b77c1"} Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.413251 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f19858-7b8d-4c40-afde-102c00500fe0-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.416385 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.04558391 podStartE2EDuration="5.41637467s" podCreationTimestamp="2025-10-12 07:51:04 +0000 UTC" firstStartedPulling="2025-10-12 07:51:05.489579557 +0000 UTC m=+962.278775058" lastFinishedPulling="2025-10-12 07:51:08.860370316 +0000 UTC m=+965.649565818" observedRunningTime="2025-10-12 07:51:09.381919084 +0000 UTC m=+966.171114587" watchObservedRunningTime="2025-10-12 07:51:09.41637467 +0000 UTC m=+966.205570163" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.418557 4599 scope.go:117] "RemoveContainer" containerID="50d89208ffbfaf2b47cfe9fde6f024ae70473304133d3f34769e62780f33b003" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.428439 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.102363846 podStartE2EDuration="3.428427797s" podCreationTimestamp="2025-10-12 
07:51:06 +0000 UTC" firstStartedPulling="2025-10-12 07:51:08.780066324 +0000 UTC m=+965.569261826" lastFinishedPulling="2025-10-12 07:51:09.106130275 +0000 UTC m=+965.895325777" observedRunningTime="2025-10-12 07:51:09.422699196 +0000 UTC m=+966.211894699" watchObservedRunningTime="2025-10-12 07:51:09.428427797 +0000 UTC m=+966.217623299" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.442479 4599 scope.go:117] "RemoveContainer" containerID="be351c4ab187da14c841501e1db3f02eca70d2ffcf8fd7925cbf3e97393380de" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.538578 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.658379 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.674605 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.687035 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:51:09 crc kubenswrapper[4599]: E1012 07:51:09.694532 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18f19858-7b8d-4c40-afde-102c00500fe0" containerName="sg-core" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.694554 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="18f19858-7b8d-4c40-afde-102c00500fe0" containerName="sg-core" Oct 12 07:51:09 crc kubenswrapper[4599]: E1012 07:51:09.694574 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18f19858-7b8d-4c40-afde-102c00500fe0" containerName="ceilometer-notification-agent" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.694582 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="18f19858-7b8d-4c40-afde-102c00500fe0" containerName="ceilometer-notification-agent" Oct 12 07:51:09 crc kubenswrapper[4599]: E1012 07:51:09.694595 4599 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18f19858-7b8d-4c40-afde-102c00500fe0" containerName="ceilometer-central-agent" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.694601 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="18f19858-7b8d-4c40-afde-102c00500fe0" containerName="ceilometer-central-agent" Oct 12 07:51:09 crc kubenswrapper[4599]: E1012 07:51:09.694617 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18f19858-7b8d-4c40-afde-102c00500fe0" containerName="proxy-httpd" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.694622 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="18f19858-7b8d-4c40-afde-102c00500fe0" containerName="proxy-httpd" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.694798 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="18f19858-7b8d-4c40-afde-102c00500fe0" containerName="ceilometer-notification-agent" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.694812 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="18f19858-7b8d-4c40-afde-102c00500fe0" containerName="proxy-httpd" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.694824 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="18f19858-7b8d-4c40-afde-102c00500fe0" containerName="ceilometer-central-agent" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.694833 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="18f19858-7b8d-4c40-afde-102c00500fe0" containerName="sg-core" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.696481 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.708810 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.709007 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.709057 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.738405 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.777489 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.821256 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.821796 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.821956 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-scripts\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " 
pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.822020 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.822111 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f64dd0c-359b-4342-a814-923eb2a16de8-run-httpd\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.822212 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg2kk\" (UniqueName: \"kubernetes.io/projected/2f64dd0c-359b-4342-a814-923eb2a16de8-kube-api-access-gg2kk\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.822398 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f64dd0c-359b-4342-a814-923eb2a16de8-log-httpd\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.822505 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-config-data\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.924782 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f64dd0c-359b-4342-a814-923eb2a16de8-log-httpd\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.924836 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-config-data\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.924908 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.924968 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.925011 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-scripts\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.925026 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " 
pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.925048 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f64dd0c-359b-4342-a814-923eb2a16de8-run-httpd\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.925081 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg2kk\" (UniqueName: \"kubernetes.io/projected/2f64dd0c-359b-4342-a814-923eb2a16de8-kube-api-access-gg2kk\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.925275 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f64dd0c-359b-4342-a814-923eb2a16de8-log-httpd\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.925477 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f64dd0c-359b-4342-a814-923eb2a16de8-run-httpd\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.929171 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-scripts\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.929276 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.929764 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.930056 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.930445 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-config-data\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:09 crc kubenswrapper[4599]: I1012 07:51:09.940489 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg2kk\" (UniqueName: \"kubernetes.io/projected/2f64dd0c-359b-4342-a814-923eb2a16de8-kube-api-access-gg2kk\") pod \"ceilometer-0\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " pod="openstack/ceilometer-0" Oct 12 07:51:10 crc kubenswrapper[4599]: I1012 07:51:10.014534 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:51:10 crc kubenswrapper[4599]: I1012 07:51:10.396212 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b8ae9f7-4558-4445-8976-db978922edde","Type":"ContainerStarted","Data":"1674c42961443f7863d2efc66469ce66109d749bd6a3f1ddbb3de56ae242fd08"} Oct 12 07:51:10 crc kubenswrapper[4599]: I1012 07:51:10.397943 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51","Type":"ContainerStarted","Data":"8994568a346847fe20c2ff047a0787b91f9da9ef5dcd578f40098327dc207211"} Oct 12 07:51:10 crc kubenswrapper[4599]: I1012 07:51:10.398004 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3bcc5217-6fbf-4f31-9f4d-cbaabc914b51" containerName="nova-metadata-log" containerID="cri-o://3a94e2f8a5f5cb8b8dd449c9db77eb5c5888f40fdd3fd11c92797faf920b6f0a" gracePeriod=30 Oct 12 07:51:10 crc kubenswrapper[4599]: I1012 07:51:10.398033 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3bcc5217-6fbf-4f31-9f4d-cbaabc914b51" containerName="nova-metadata-metadata" containerID="cri-o://8994568a346847fe20c2ff047a0787b91f9da9ef5dcd578f40098327dc207211" gracePeriod=30 Oct 12 07:51:10 crc kubenswrapper[4599]: I1012 07:51:10.401812 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 12 07:51:10 crc kubenswrapper[4599]: I1012 07:51:10.420950 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.13751177 podStartE2EDuration="6.420928825s" podCreationTimestamp="2025-10-12 07:51:04 +0000 UTC" firstStartedPulling="2025-10-12 07:51:05.587531125 +0000 UTC m=+962.376726627" lastFinishedPulling="2025-10-12 07:51:08.87094818 +0000 UTC m=+965.660143682" 
observedRunningTime="2025-10-12 07:51:10.417549266 +0000 UTC m=+967.206744768" watchObservedRunningTime="2025-10-12 07:51:10.420928825 +0000 UTC m=+967.210124327" Oct 12 07:51:10 crc kubenswrapper[4599]: I1012 07:51:10.433133 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:51:10 crc kubenswrapper[4599]: I1012 07:51:10.446754 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.371861464 podStartE2EDuration="6.446735748s" podCreationTimestamp="2025-10-12 07:51:04 +0000 UTC" firstStartedPulling="2025-10-12 07:51:05.791093837 +0000 UTC m=+962.580289339" lastFinishedPulling="2025-10-12 07:51:08.86596812 +0000 UTC m=+965.655163623" observedRunningTime="2025-10-12 07:51:10.445123205 +0000 UTC m=+967.234318707" watchObservedRunningTime="2025-10-12 07:51:10.446735748 +0000 UTC m=+967.235931250" Oct 12 07:51:10 crc kubenswrapper[4599]: I1012 07:51:10.889466 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 07:51:10 crc kubenswrapper[4599]: I1012 07:51:10.943296 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-logs\") pod \"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51\" (UID: \"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51\") " Oct 12 07:51:10 crc kubenswrapper[4599]: I1012 07:51:10.943368 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-config-data\") pod \"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51\" (UID: \"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51\") " Oct 12 07:51:10 crc kubenswrapper[4599]: I1012 07:51:10.943453 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c45zb\" (UniqueName: \"kubernetes.io/projected/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-kube-api-access-c45zb\") pod \"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51\" (UID: \"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51\") " Oct 12 07:51:10 crc kubenswrapper[4599]: I1012 07:51:10.943499 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-combined-ca-bundle\") pod \"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51\" (UID: \"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51\") " Oct 12 07:51:10 crc kubenswrapper[4599]: I1012 07:51:10.944730 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-logs" (OuterVolumeSpecName: "logs") pod "3bcc5217-6fbf-4f31-9f4d-cbaabc914b51" (UID: "3bcc5217-6fbf-4f31-9f4d-cbaabc914b51"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:51:10 crc kubenswrapper[4599]: I1012 07:51:10.949280 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-kube-api-access-c45zb" (OuterVolumeSpecName: "kube-api-access-c45zb") pod "3bcc5217-6fbf-4f31-9f4d-cbaabc914b51" (UID: "3bcc5217-6fbf-4f31-9f4d-cbaabc914b51"). InnerVolumeSpecName "kube-api-access-c45zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:51:10 crc kubenswrapper[4599]: I1012 07:51:10.970163 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bcc5217-6fbf-4f31-9f4d-cbaabc914b51" (UID: "3bcc5217-6fbf-4f31-9f4d-cbaabc914b51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:10 crc kubenswrapper[4599]: I1012 07:51:10.970986 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-config-data" (OuterVolumeSpecName: "config-data") pod "3bcc5217-6fbf-4f31-9f4d-cbaabc914b51" (UID: "3bcc5217-6fbf-4f31-9f4d-cbaabc914b51"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.045476 4599 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-logs\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.045502 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.045513 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c45zb\" (UniqueName: \"kubernetes.io/projected/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-kube-api-access-c45zb\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.045524 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.410451 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f64dd0c-359b-4342-a814-923eb2a16de8","Type":"ContainerStarted","Data":"e165eb4006b0aef1fa3a4b1b73a4c8cd2fbea6091ee302ca98c3f51058dfdea5"} Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.410804 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f64dd0c-359b-4342-a814-923eb2a16de8","Type":"ContainerStarted","Data":"d85eddc6c4050e4b6a07b4774ae8eae193972c6f81e8ef799e267af2d40f1f06"} Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.414166 4599 generic.go:334] "Generic (PLEG): container finished" podID="73cd69f3-194c-4484-a0cf-01815024d884" containerID="9251b6efd4054f03bf78b785b242d671aede9a26bb2415e8b96330e3b4e6d0de" exitCode=0 Oct 12 07:51:11 crc 
kubenswrapper[4599]: I1012 07:51:11.414259 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-f7jhc" event={"ID":"73cd69f3-194c-4484-a0cf-01815024d884","Type":"ContainerDied","Data":"9251b6efd4054f03bf78b785b242d671aede9a26bb2415e8b96330e3b4e6d0de"} Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.416101 4599 generic.go:334] "Generic (PLEG): container finished" podID="3bcc5217-6fbf-4f31-9f4d-cbaabc914b51" containerID="8994568a346847fe20c2ff047a0787b91f9da9ef5dcd578f40098327dc207211" exitCode=0 Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.416194 4599 generic.go:334] "Generic (PLEG): container finished" podID="3bcc5217-6fbf-4f31-9f4d-cbaabc914b51" containerID="3a94e2f8a5f5cb8b8dd449c9db77eb5c5888f40fdd3fd11c92797faf920b6f0a" exitCode=143 Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.416373 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51","Type":"ContainerDied","Data":"8994568a346847fe20c2ff047a0787b91f9da9ef5dcd578f40098327dc207211"} Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.416434 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.416461 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51","Type":"ContainerDied","Data":"3a94e2f8a5f5cb8b8dd449c9db77eb5c5888f40fdd3fd11c92797faf920b6f0a"} Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.416475 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3bcc5217-6fbf-4f31-9f4d-cbaabc914b51","Type":"ContainerDied","Data":"d19a227617f6991796b6f345a88c0e0463bbff500e6fdc085b679d8fae9502cd"} Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.416501 4599 scope.go:117] "RemoveContainer" containerID="8994568a346847fe20c2ff047a0787b91f9da9ef5dcd578f40098327dc207211" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.460172 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.460324 4599 scope.go:117] "RemoveContainer" containerID="3a94e2f8a5f5cb8b8dd449c9db77eb5c5888f40fdd3fd11c92797faf920b6f0a" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.500597 4599 scope.go:117] "RemoveContainer" containerID="8994568a346847fe20c2ff047a0787b91f9da9ef5dcd578f40098327dc207211" Oct 12 07:51:11 crc kubenswrapper[4599]: E1012 07:51:11.501100 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8994568a346847fe20c2ff047a0787b91f9da9ef5dcd578f40098327dc207211\": container with ID starting with 8994568a346847fe20c2ff047a0787b91f9da9ef5dcd578f40098327dc207211 not found: ID does not exist" containerID="8994568a346847fe20c2ff047a0787b91f9da9ef5dcd578f40098327dc207211" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.501152 4599 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8994568a346847fe20c2ff047a0787b91f9da9ef5dcd578f40098327dc207211"} err="failed to get container status \"8994568a346847fe20c2ff047a0787b91f9da9ef5dcd578f40098327dc207211\": rpc error: code = NotFound desc = could not find container \"8994568a346847fe20c2ff047a0787b91f9da9ef5dcd578f40098327dc207211\": container with ID starting with 8994568a346847fe20c2ff047a0787b91f9da9ef5dcd578f40098327dc207211 not found: ID does not exist" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.501184 4599 scope.go:117] "RemoveContainer" containerID="3a94e2f8a5f5cb8b8dd449c9db77eb5c5888f40fdd3fd11c92797faf920b6f0a" Oct 12 07:51:11 crc kubenswrapper[4599]: E1012 07:51:11.501542 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a94e2f8a5f5cb8b8dd449c9db77eb5c5888f40fdd3fd11c92797faf920b6f0a\": container with ID starting with 3a94e2f8a5f5cb8b8dd449c9db77eb5c5888f40fdd3fd11c92797faf920b6f0a not found: ID does not exist" containerID="3a94e2f8a5f5cb8b8dd449c9db77eb5c5888f40fdd3fd11c92797faf920b6f0a" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.501589 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a94e2f8a5f5cb8b8dd449c9db77eb5c5888f40fdd3fd11c92797faf920b6f0a"} err="failed to get container status \"3a94e2f8a5f5cb8b8dd449c9db77eb5c5888f40fdd3fd11c92797faf920b6f0a\": rpc error: code = NotFound desc = could not find container \"3a94e2f8a5f5cb8b8dd449c9db77eb5c5888f40fdd3fd11c92797faf920b6f0a\": container with ID starting with 3a94e2f8a5f5cb8b8dd449c9db77eb5c5888f40fdd3fd11c92797faf920b6f0a not found: ID does not exist" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.501606 4599 scope.go:117] "RemoveContainer" containerID="8994568a346847fe20c2ff047a0787b91f9da9ef5dcd578f40098327dc207211" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.501864 4599 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"8994568a346847fe20c2ff047a0787b91f9da9ef5dcd578f40098327dc207211"} err="failed to get container status \"8994568a346847fe20c2ff047a0787b91f9da9ef5dcd578f40098327dc207211\": rpc error: code = NotFound desc = could not find container \"8994568a346847fe20c2ff047a0787b91f9da9ef5dcd578f40098327dc207211\": container with ID starting with 8994568a346847fe20c2ff047a0787b91f9da9ef5dcd578f40098327dc207211 not found: ID does not exist" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.501898 4599 scope.go:117] "RemoveContainer" containerID="3a94e2f8a5f5cb8b8dd449c9db77eb5c5888f40fdd3fd11c92797faf920b6f0a" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.502117 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a94e2f8a5f5cb8b8dd449c9db77eb5c5888f40fdd3fd11c92797faf920b6f0a"} err="failed to get container status \"3a94e2f8a5f5cb8b8dd449c9db77eb5c5888f40fdd3fd11c92797faf920b6f0a\": rpc error: code = NotFound desc = could not find container \"3a94e2f8a5f5cb8b8dd449c9db77eb5c5888f40fdd3fd11c92797faf920b6f0a\": container with ID starting with 3a94e2f8a5f5cb8b8dd449c9db77eb5c5888f40fdd3fd11c92797faf920b6f0a not found: ID does not exist" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.514174 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.520997 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 12 07:51:11 crc kubenswrapper[4599]: E1012 07:51:11.521604 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bcc5217-6fbf-4f31-9f4d-cbaabc914b51" containerName="nova-metadata-log" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.521627 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bcc5217-6fbf-4f31-9f4d-cbaabc914b51" containerName="nova-metadata-log" Oct 12 07:51:11 crc kubenswrapper[4599]: E1012 
07:51:11.521640 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bcc5217-6fbf-4f31-9f4d-cbaabc914b51" containerName="nova-metadata-metadata" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.521646 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bcc5217-6fbf-4f31-9f4d-cbaabc914b51" containerName="nova-metadata-metadata" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.521832 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bcc5217-6fbf-4f31-9f4d-cbaabc914b51" containerName="nova-metadata-metadata" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.521854 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bcc5217-6fbf-4f31-9f4d-cbaabc914b51" containerName="nova-metadata-log" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.523003 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.525140 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.527122 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.540681 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.555447 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18f19858-7b8d-4c40-afde-102c00500fe0" path="/var/lib/kubelet/pods/18f19858-7b8d-4c40-afde-102c00500fe0/volumes" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.556197 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bcc5217-6fbf-4f31-9f4d-cbaabc914b51" path="/var/lib/kubelet/pods/3bcc5217-6fbf-4f31-9f4d-cbaabc914b51/volumes" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 
07:51:11.557327 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0cdd903-a3a8-4634-8c76-d8702ecce12c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\") " pod="openstack/nova-metadata-0" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.557479 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq9g5\" (UniqueName: \"kubernetes.io/projected/e0cdd903-a3a8-4634-8c76-d8702ecce12c-kube-api-access-lq9g5\") pod \"nova-metadata-0\" (UID: \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\") " pod="openstack/nova-metadata-0" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.557511 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0cdd903-a3a8-4634-8c76-d8702ecce12c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\") " pod="openstack/nova-metadata-0" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.557609 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0cdd903-a3a8-4634-8c76-d8702ecce12c-logs\") pod \"nova-metadata-0\" (UID: \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\") " pod="openstack/nova-metadata-0" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.557690 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0cdd903-a3a8-4634-8c76-d8702ecce12c-config-data\") pod \"nova-metadata-0\" (UID: \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\") " pod="openstack/nova-metadata-0" Oct 12 07:51:11 crc kubenswrapper[4599]: E1012 07:51:11.605168 4599 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bcc5217_6fbf_4f31_9f4d_cbaabc914b51.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bcc5217_6fbf_4f31_9f4d_cbaabc914b51.slice/crio-d19a227617f6991796b6f345a88c0e0463bbff500e6fdc085b679d8fae9502cd\": RecentStats: unable to find data in memory cache]" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.659545 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0cdd903-a3a8-4634-8c76-d8702ecce12c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\") " pod="openstack/nova-metadata-0" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.659632 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0cdd903-a3a8-4634-8c76-d8702ecce12c-logs\") pod \"nova-metadata-0\" (UID: \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\") " pod="openstack/nova-metadata-0" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.659704 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0cdd903-a3a8-4634-8c76-d8702ecce12c-config-data\") pod \"nova-metadata-0\" (UID: \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\") " pod="openstack/nova-metadata-0" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.659829 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0cdd903-a3a8-4634-8c76-d8702ecce12c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\") " pod="openstack/nova-metadata-0" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.659876 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lq9g5\" (UniqueName: \"kubernetes.io/projected/e0cdd903-a3a8-4634-8c76-d8702ecce12c-kube-api-access-lq9g5\") pod \"nova-metadata-0\" (UID: \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\") " pod="openstack/nova-metadata-0" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.660822 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0cdd903-a3a8-4634-8c76-d8702ecce12c-logs\") pod \"nova-metadata-0\" (UID: \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\") " pod="openstack/nova-metadata-0" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.664399 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0cdd903-a3a8-4634-8c76-d8702ecce12c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\") " pod="openstack/nova-metadata-0" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.664756 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0cdd903-a3a8-4634-8c76-d8702ecce12c-config-data\") pod \"nova-metadata-0\" (UID: \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\") " pod="openstack/nova-metadata-0" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.664816 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0cdd903-a3a8-4634-8c76-d8702ecce12c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\") " pod="openstack/nova-metadata-0" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.676181 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq9g5\" (UniqueName: \"kubernetes.io/projected/e0cdd903-a3a8-4634-8c76-d8702ecce12c-kube-api-access-lq9g5\") pod \"nova-metadata-0\" (UID: 
\"e0cdd903-a3a8-4634-8c76-d8702ecce12c\") " pod="openstack/nova-metadata-0" Oct 12 07:51:11 crc kubenswrapper[4599]: I1012 07:51:11.838815 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 07:51:12 crc kubenswrapper[4599]: I1012 07:51:12.279705 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 07:51:12 crc kubenswrapper[4599]: I1012 07:51:12.428159 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e0cdd903-a3a8-4634-8c76-d8702ecce12c","Type":"ContainerStarted","Data":"7e494cad81fe4b5babc364b23b49241bd0216c3348b868bc2015f0a07b199594"} Oct 12 07:51:12 crc kubenswrapper[4599]: I1012 07:51:12.428211 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e0cdd903-a3a8-4634-8c76-d8702ecce12c","Type":"ContainerStarted","Data":"ef5dbcadaffe879ff34dafd6584541118d5150d9784f7683683e4bb8fdffb5a6"} Oct 12 07:51:12 crc kubenswrapper[4599]: I1012 07:51:12.436361 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f64dd0c-359b-4342-a814-923eb2a16de8","Type":"ContainerStarted","Data":"d2dedd30e3874783f4673920d521039ef9fdf4a60edf85cfeec76a0dba065727"} Oct 12 07:51:12 crc kubenswrapper[4599]: I1012 07:51:12.661740 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-f7jhc" Oct 12 07:51:12 crc kubenswrapper[4599]: I1012 07:51:12.678804 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsb7c\" (UniqueName: \"kubernetes.io/projected/73cd69f3-194c-4484-a0cf-01815024d884-kube-api-access-lsb7c\") pod \"73cd69f3-194c-4484-a0cf-01815024d884\" (UID: \"73cd69f3-194c-4484-a0cf-01815024d884\") " Oct 12 07:51:12 crc kubenswrapper[4599]: I1012 07:51:12.678907 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73cd69f3-194c-4484-a0cf-01815024d884-config-data\") pod \"73cd69f3-194c-4484-a0cf-01815024d884\" (UID: \"73cd69f3-194c-4484-a0cf-01815024d884\") " Oct 12 07:51:12 crc kubenswrapper[4599]: I1012 07:51:12.679047 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73cd69f3-194c-4484-a0cf-01815024d884-combined-ca-bundle\") pod \"73cd69f3-194c-4484-a0cf-01815024d884\" (UID: \"73cd69f3-194c-4484-a0cf-01815024d884\") " Oct 12 07:51:12 crc kubenswrapper[4599]: I1012 07:51:12.679112 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73cd69f3-194c-4484-a0cf-01815024d884-scripts\") pod \"73cd69f3-194c-4484-a0cf-01815024d884\" (UID: \"73cd69f3-194c-4484-a0cf-01815024d884\") " Oct 12 07:51:12 crc kubenswrapper[4599]: I1012 07:51:12.688680 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73cd69f3-194c-4484-a0cf-01815024d884-scripts" (OuterVolumeSpecName: "scripts") pod "73cd69f3-194c-4484-a0cf-01815024d884" (UID: "73cd69f3-194c-4484-a0cf-01815024d884"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:12 crc kubenswrapper[4599]: I1012 07:51:12.688696 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73cd69f3-194c-4484-a0cf-01815024d884-kube-api-access-lsb7c" (OuterVolumeSpecName: "kube-api-access-lsb7c") pod "73cd69f3-194c-4484-a0cf-01815024d884" (UID: "73cd69f3-194c-4484-a0cf-01815024d884"). InnerVolumeSpecName "kube-api-access-lsb7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:51:12 crc kubenswrapper[4599]: I1012 07:51:12.708059 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73cd69f3-194c-4484-a0cf-01815024d884-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73cd69f3-194c-4484-a0cf-01815024d884" (UID: "73cd69f3-194c-4484-a0cf-01815024d884"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:12 crc kubenswrapper[4599]: I1012 07:51:12.710773 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73cd69f3-194c-4484-a0cf-01815024d884-config-data" (OuterVolumeSpecName: "config-data") pod "73cd69f3-194c-4484-a0cf-01815024d884" (UID: "73cd69f3-194c-4484-a0cf-01815024d884"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:12 crc kubenswrapper[4599]: I1012 07:51:12.781047 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73cd69f3-194c-4484-a0cf-01815024d884-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:12 crc kubenswrapper[4599]: I1012 07:51:12.781083 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73cd69f3-194c-4484-a0cf-01815024d884-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:12 crc kubenswrapper[4599]: I1012 07:51:12.781096 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsb7c\" (UniqueName: \"kubernetes.io/projected/73cd69f3-194c-4484-a0cf-01815024d884-kube-api-access-lsb7c\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:12 crc kubenswrapper[4599]: I1012 07:51:12.781109 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73cd69f3-194c-4484-a0cf-01815024d884-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.451308 4599 generic.go:334] "Generic (PLEG): container finished" podID="a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8" containerID="4d592c4300dd77d3ce4d85000060919b3b15a7d14a77df4f4fd43b2a04a20080" exitCode=0 Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.451749 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tszt8" event={"ID":"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8","Type":"ContainerDied","Data":"4d592c4300dd77d3ce4d85000060919b3b15a7d14a77df4f4fd43b2a04a20080"} Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.471105 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e0cdd903-a3a8-4634-8c76-d8702ecce12c","Type":"ContainerStarted","Data":"ae18e3632fde1f6b4c5e8e3d561b09c966a53414de6d55aaa8b5c5785033223b"} Oct 12 
07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.497809 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f64dd0c-359b-4342-a814-923eb2a16de8","Type":"ContainerStarted","Data":"b0ff498e8a7d91880aab3b49293ea7be8e807a97fa9b5cf175d3fd4c0293ae08"} Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.499570 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-f7jhc" event={"ID":"73cd69f3-194c-4484-a0cf-01815024d884","Type":"ContainerDied","Data":"5c443e4278162edd441b0ad051b349ad231b1ac1bf4c26e249843a9df4858158"} Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.499605 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-f7jhc" Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.499615 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c443e4278162edd441b0ad051b349ad231b1ac1bf4c26e249843a9df4858158" Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.530455 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 12 07:51:13 crc kubenswrapper[4599]: E1012 07:51:13.532391 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73cd69f3-194c-4484-a0cf-01815024d884" containerName="nova-cell1-conductor-db-sync" Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.532415 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="73cd69f3-194c-4484-a0cf-01815024d884" containerName="nova-cell1-conductor-db-sync" Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.532617 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="73cd69f3-194c-4484-a0cf-01815024d884" containerName="nova-cell1-conductor-db-sync" Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.533282 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.538451 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.543429 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.545129 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.545111105 podStartE2EDuration="2.545111105s" podCreationTimestamp="2025-10-12 07:51:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:51:13.502176387 +0000 UTC m=+970.291371889" watchObservedRunningTime="2025-10-12 07:51:13.545111105 +0000 UTC m=+970.334306608" Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.596600 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c21fe8be-d815-4e07-9ea8-e22d73e2dd8f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c21fe8be-d815-4e07-9ea8-e22d73e2dd8f\") " pod="openstack/nova-cell1-conductor-0" Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.597010 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c21fe8be-d815-4e07-9ea8-e22d73e2dd8f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c21fe8be-d815-4e07-9ea8-e22d73e2dd8f\") " pod="openstack/nova-cell1-conductor-0" Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.597140 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwkdm\" (UniqueName: 
\"kubernetes.io/projected/c21fe8be-d815-4e07-9ea8-e22d73e2dd8f-kube-api-access-nwkdm\") pod \"nova-cell1-conductor-0\" (UID: \"c21fe8be-d815-4e07-9ea8-e22d73e2dd8f\") " pod="openstack/nova-cell1-conductor-0" Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.699108 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c21fe8be-d815-4e07-9ea8-e22d73e2dd8f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c21fe8be-d815-4e07-9ea8-e22d73e2dd8f\") " pod="openstack/nova-cell1-conductor-0" Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.699278 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c21fe8be-d815-4e07-9ea8-e22d73e2dd8f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c21fe8be-d815-4e07-9ea8-e22d73e2dd8f\") " pod="openstack/nova-cell1-conductor-0" Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.699367 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwkdm\" (UniqueName: \"kubernetes.io/projected/c21fe8be-d815-4e07-9ea8-e22d73e2dd8f-kube-api-access-nwkdm\") pod \"nova-cell1-conductor-0\" (UID: \"c21fe8be-d815-4e07-9ea8-e22d73e2dd8f\") " pod="openstack/nova-cell1-conductor-0" Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.705945 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c21fe8be-d815-4e07-9ea8-e22d73e2dd8f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c21fe8be-d815-4e07-9ea8-e22d73e2dd8f\") " pod="openstack/nova-cell1-conductor-0" Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.715006 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c21fe8be-d815-4e07-9ea8-e22d73e2dd8f-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"c21fe8be-d815-4e07-9ea8-e22d73e2dd8f\") " pod="openstack/nova-cell1-conductor-0" Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.725309 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwkdm\" (UniqueName: \"kubernetes.io/projected/c21fe8be-d815-4e07-9ea8-e22d73e2dd8f-kube-api-access-nwkdm\") pod \"nova-cell1-conductor-0\" (UID: \"c21fe8be-d815-4e07-9ea8-e22d73e2dd8f\") " pod="openstack/nova-cell1-conductor-0" Oct 12 07:51:13 crc kubenswrapper[4599]: I1012 07:51:13.853132 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.295997 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.512651 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f64dd0c-359b-4342-a814-923eb2a16de8","Type":"ContainerStarted","Data":"604af05a6a54da5d9dec5916ada02a8812757e25950e35a59d9ad86da4c46901"} Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.513635 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.517718 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c21fe8be-d815-4e07-9ea8-e22d73e2dd8f","Type":"ContainerStarted","Data":"7c9bdd3cfc5f2365bfa9109f781ce6aed0eef602861912c50ab5614a00d88c90"} Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.517766 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c21fe8be-d815-4e07-9ea8-e22d73e2dd8f","Type":"ContainerStarted","Data":"cf0699965bbb2e909827f3a5662cd7c24a4af0bb9525cf1299dccd0f2f487c2c"} Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.518720 4599 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.525534 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.525586 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.534636 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.6473536420000001 podStartE2EDuration="5.534616329s" podCreationTimestamp="2025-10-12 07:51:09 +0000 UTC" firstStartedPulling="2025-10-12 07:51:10.437496852 +0000 UTC m=+967.226692354" lastFinishedPulling="2025-10-12 07:51:14.324759539 +0000 UTC m=+971.113955041" observedRunningTime="2025-10-12 07:51:14.530283492 +0000 UTC m=+971.319478993" watchObservedRunningTime="2025-10-12 07:51:14.534616329 +0000 UTC m=+971.323811831" Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.538725 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.550146 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.55012126 podStartE2EDuration="1.55012126s" podCreationTimestamp="2025-10-12 07:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:51:14.545309379 +0000 UTC m=+971.334504882" watchObservedRunningTime="2025-10-12 07:51:14.55012126 +0000 UTC m=+971.339316762" Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.575651 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.743698 4599 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tszt8" Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.819052 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-config-data\") pod \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\" (UID: \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\") " Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.820189 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-scripts\") pod \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\" (UID: \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\") " Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.820283 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcrsf\" (UniqueName: \"kubernetes.io/projected/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-kube-api-access-fcrsf\") pod \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\" (UID: \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\") " Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.820368 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-combined-ca-bundle\") pod \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\" (UID: \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\") " Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.824387 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-scripts" (OuterVolumeSpecName: "scripts") pod "a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8" (UID: "a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.824438 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-kube-api-access-fcrsf" (OuterVolumeSpecName: "kube-api-access-fcrsf") pod "a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8" (UID: "a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8"). InnerVolumeSpecName "kube-api-access-fcrsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:51:14 crc kubenswrapper[4599]: E1012 07:51:14.838148 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-config-data podName:a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8 nodeName:}" failed. No retries permitted until 2025-10-12 07:51:15.338126685 +0000 UTC m=+972.127322187 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-config-data") pod "a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8" (UID: "a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8") : error deleting /var/lib/kubelet/pods/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8/volume-subpaths: remove /var/lib/kubelet/pods/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8/volume-subpaths: no such file or directory Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.840677 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8" (UID: "a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.908702 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn" Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.921847 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcrsf\" (UniqueName: \"kubernetes.io/projected/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-kube-api-access-fcrsf\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.921875 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.921888 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.959947 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56c55fd88c-fnjlg"] Oct 12 07:51:14 crc kubenswrapper[4599]: I1012 07:51:14.960174 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" podUID="ebd046c8-0af0-4d96-99a4-ce5bb0dec770" containerName="dnsmasq-dns" containerID="cri-o://658f7996d8872d03e757d2c4af95c3bc377dfbed1c8efa8a57c96049fa3952f0" gracePeriod=10 Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.412791 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.432824 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vfz5\" (UniqueName: \"kubernetes.io/projected/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-kube-api-access-5vfz5\") pod \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.433142 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-ovsdbserver-sb\") pod \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.433260 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-config-data\") pod \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\" (UID: \"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8\") " Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.433365 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-config\") pod \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.433446 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-dns-svc\") pod \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.433524 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-dns-swift-storage-0\") pod \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.433624 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-ovsdbserver-nb\") pod \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\" (UID: \"ebd046c8-0af0-4d96-99a4-ce5bb0dec770\") " Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.445666 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-kube-api-access-5vfz5" (OuterVolumeSpecName: "kube-api-access-5vfz5") pod "ebd046c8-0af0-4d96-99a4-ce5bb0dec770" (UID: "ebd046c8-0af0-4d96-99a4-ce5bb0dec770"). InnerVolumeSpecName "kube-api-access-5vfz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.455708 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-config-data" (OuterVolumeSpecName: "config-data") pod "a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8" (UID: "a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.512861 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ebd046c8-0af0-4d96-99a4-ce5bb0dec770" (UID: "ebd046c8-0af0-4d96-99a4-ce5bb0dec770"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.514745 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ebd046c8-0af0-4d96-99a4-ce5bb0dec770" (UID: "ebd046c8-0af0-4d96-99a4-ce5bb0dec770"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.520216 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-config" (OuterVolumeSpecName: "config") pod "ebd046c8-0af0-4d96-99a4-ce5bb0dec770" (UID: "ebd046c8-0af0-4d96-99a4-ce5bb0dec770"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.525640 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ebd046c8-0af0-4d96-99a4-ce5bb0dec770" (UID: "ebd046c8-0af0-4d96-99a4-ce5bb0dec770"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.528786 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tszt8" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.530017 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tszt8" event={"ID":"a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8","Type":"ContainerDied","Data":"23a42d6d88c6e5c7392e3d9b49f7690b56078cfd080d27d4e7e682d702e7c0c2"} Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.530190 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23a42d6d88c6e5c7392e3d9b49f7690b56078cfd080d27d4e7e682d702e7c0c2" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.534696 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.534799 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.534864 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.534926 4599 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.534981 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.535038 4599 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-5vfz5\" (UniqueName: \"kubernetes.io/projected/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-kube-api-access-5vfz5\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.535018 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" event={"ID":"ebd046c8-0af0-4d96-99a4-ce5bb0dec770","Type":"ContainerDied","Data":"658f7996d8872d03e757d2c4af95c3bc377dfbed1c8efa8a57c96049fa3952f0"} Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.535160 4599 scope.go:117] "RemoveContainer" containerID="658f7996d8872d03e757d2c4af95c3bc377dfbed1c8efa8a57c96049fa3952f0" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.534986 4599 generic.go:334] "Generic (PLEG): container finished" podID="ebd046c8-0af0-4d96-99a4-ce5bb0dec770" containerID="658f7996d8872d03e757d2c4af95c3bc377dfbed1c8efa8a57c96049fa3952f0" exitCode=0 Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.535817 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" event={"ID":"ebd046c8-0af0-4d96-99a4-ce5bb0dec770","Type":"ContainerDied","Data":"308577e1ca57abdf929523a1ea568f4bb0cc48d143e57aabc6dd6726a949273f"} Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.535091 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56c55fd88c-fnjlg" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.537063 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ebd046c8-0af0-4d96-99a4-ce5bb0dec770" (UID: "ebd046c8-0af0-4d96-99a4-ce5bb0dec770"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.571561 4599 scope.go:117] "RemoveContainer" containerID="862c749414fb2e59c5221d01459de7394ea1c056bcdadc2badceb43204cc4214" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.595819 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.610248 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3b8ae9f7-4558-4445-8976-db978922edde" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.182:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.612110 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3b8ae9f7-4558-4445-8976-db978922edde" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.182:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.626262 4599 scope.go:117] "RemoveContainer" containerID="658f7996d8872d03e757d2c4af95c3bc377dfbed1c8efa8a57c96049fa3952f0" Oct 12 07:51:15 crc kubenswrapper[4599]: E1012 07:51:15.626778 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"658f7996d8872d03e757d2c4af95c3bc377dfbed1c8efa8a57c96049fa3952f0\": container with ID starting with 658f7996d8872d03e757d2c4af95c3bc377dfbed1c8efa8a57c96049fa3952f0 not found: ID does not exist" containerID="658f7996d8872d03e757d2c4af95c3bc377dfbed1c8efa8a57c96049fa3952f0" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.626811 4599 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"658f7996d8872d03e757d2c4af95c3bc377dfbed1c8efa8a57c96049fa3952f0"} err="failed to get container status \"658f7996d8872d03e757d2c4af95c3bc377dfbed1c8efa8a57c96049fa3952f0\": rpc error: code = NotFound desc = could not find container \"658f7996d8872d03e757d2c4af95c3bc377dfbed1c8efa8a57c96049fa3952f0\": container with ID starting with 658f7996d8872d03e757d2c4af95c3bc377dfbed1c8efa8a57c96049fa3952f0 not found: ID does not exist" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.626830 4599 scope.go:117] "RemoveContainer" containerID="862c749414fb2e59c5221d01459de7394ea1c056bcdadc2badceb43204cc4214" Oct 12 07:51:15 crc kubenswrapper[4599]: E1012 07:51:15.630354 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"862c749414fb2e59c5221d01459de7394ea1c056bcdadc2badceb43204cc4214\": container with ID starting with 862c749414fb2e59c5221d01459de7394ea1c056bcdadc2badceb43204cc4214 not found: ID does not exist" containerID="862c749414fb2e59c5221d01459de7394ea1c056bcdadc2badceb43204cc4214" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.630383 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"862c749414fb2e59c5221d01459de7394ea1c056bcdadc2badceb43204cc4214"} err="failed to get container status \"862c749414fb2e59c5221d01459de7394ea1c056bcdadc2badceb43204cc4214\": rpc error: code = NotFound desc = could not find container \"862c749414fb2e59c5221d01459de7394ea1c056bcdadc2badceb43204cc4214\": container with ID starting with 862c749414fb2e59c5221d01459de7394ea1c056bcdadc2badceb43204cc4214 not found: ID does not exist" Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.637404 4599 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ebd046c8-0af0-4d96-99a4-ce5bb0dec770-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:15 crc 
kubenswrapper[4599]: I1012 07:51:15.641402 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.641683 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3b8ae9f7-4558-4445-8976-db978922edde" containerName="nova-api-log" containerID="cri-o://8aa833c06a2a9e762622ad8c932f1593ea522e274c9387755ea12c13264466ad" gracePeriod=30 Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.641855 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3b8ae9f7-4558-4445-8976-db978922edde" containerName="nova-api-api" containerID="cri-o://1674c42961443f7863d2efc66469ce66109d749bd6a3f1ddbb3de56ae242fd08" gracePeriod=30 Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.672992 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.704948 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.705193 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e0cdd903-a3a8-4634-8c76-d8702ecce12c" containerName="nova-metadata-log" containerID="cri-o://7e494cad81fe4b5babc364b23b49241bd0216c3348b868bc2015f0a07b199594" gracePeriod=30 Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.705847 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e0cdd903-a3a8-4634-8c76-d8702ecce12c" containerName="nova-metadata-metadata" containerID="cri-o://ae18e3632fde1f6b4c5e8e3d561b09c966a53414de6d55aaa8b5c5785033223b" gracePeriod=30 Oct 12 07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.870384 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56c55fd88c-fnjlg"] Oct 12 
07:51:15 crc kubenswrapper[4599]: I1012 07:51:15.877755 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56c55fd88c-fnjlg"] Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.121661 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.144533 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0cdd903-a3a8-4634-8c76-d8702ecce12c-config-data\") pod \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\" (UID: \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\") " Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.144787 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0cdd903-a3a8-4634-8c76-d8702ecce12c-logs\") pod \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\" (UID: \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\") " Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.144879 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0cdd903-a3a8-4634-8c76-d8702ecce12c-combined-ca-bundle\") pod \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\" (UID: \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\") " Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.145026 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0cdd903-a3a8-4634-8c76-d8702ecce12c-nova-metadata-tls-certs\") pod \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\" (UID: \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\") " Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.145111 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq9g5\" (UniqueName: 
\"kubernetes.io/projected/e0cdd903-a3a8-4634-8c76-d8702ecce12c-kube-api-access-lq9g5\") pod \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\" (UID: \"e0cdd903-a3a8-4634-8c76-d8702ecce12c\") " Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.146370 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0cdd903-a3a8-4634-8c76-d8702ecce12c-logs" (OuterVolumeSpecName: "logs") pod "e0cdd903-a3a8-4634-8c76-d8702ecce12c" (UID: "e0cdd903-a3a8-4634-8c76-d8702ecce12c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.150587 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0cdd903-a3a8-4634-8c76-d8702ecce12c-kube-api-access-lq9g5" (OuterVolumeSpecName: "kube-api-access-lq9g5") pod "e0cdd903-a3a8-4634-8c76-d8702ecce12c" (UID: "e0cdd903-a3a8-4634-8c76-d8702ecce12c"). InnerVolumeSpecName "kube-api-access-lq9g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.174270 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0cdd903-a3a8-4634-8c76-d8702ecce12c-config-data" (OuterVolumeSpecName: "config-data") pod "e0cdd903-a3a8-4634-8c76-d8702ecce12c" (UID: "e0cdd903-a3a8-4634-8c76-d8702ecce12c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.175528 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0cdd903-a3a8-4634-8c76-d8702ecce12c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0cdd903-a3a8-4634-8c76-d8702ecce12c" (UID: "e0cdd903-a3a8-4634-8c76-d8702ecce12c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.216548 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0cdd903-a3a8-4634-8c76-d8702ecce12c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e0cdd903-a3a8-4634-8c76-d8702ecce12c" (UID: "e0cdd903-a3a8-4634-8c76-d8702ecce12c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.248669 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0cdd903-a3a8-4634-8c76-d8702ecce12c-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.248712 4599 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0cdd903-a3a8-4634-8c76-d8702ecce12c-logs\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.248722 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0cdd903-a3a8-4634-8c76-d8702ecce12c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.248737 4599 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0cdd903-a3a8-4634-8c76-d8702ecce12c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.248752 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq9g5\" (UniqueName: \"kubernetes.io/projected/e0cdd903-a3a8-4634-8c76-d8702ecce12c-kube-api-access-lq9g5\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.544311 4599 generic.go:334] "Generic (PLEG): container finished" 
podID="3b8ae9f7-4558-4445-8976-db978922edde" containerID="8aa833c06a2a9e762622ad8c932f1593ea522e274c9387755ea12c13264466ad" exitCode=143 Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.544388 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b8ae9f7-4558-4445-8976-db978922edde","Type":"ContainerDied","Data":"8aa833c06a2a9e762622ad8c932f1593ea522e274c9387755ea12c13264466ad"} Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.547177 4599 generic.go:334] "Generic (PLEG): container finished" podID="e0cdd903-a3a8-4634-8c76-d8702ecce12c" containerID="ae18e3632fde1f6b4c5e8e3d561b09c966a53414de6d55aaa8b5c5785033223b" exitCode=0 Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.547202 4599 generic.go:334] "Generic (PLEG): container finished" podID="e0cdd903-a3a8-4634-8c76-d8702ecce12c" containerID="7e494cad81fe4b5babc364b23b49241bd0216c3348b868bc2015f0a07b199594" exitCode=143 Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.547251 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.547276 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e0cdd903-a3a8-4634-8c76-d8702ecce12c","Type":"ContainerDied","Data":"ae18e3632fde1f6b4c5e8e3d561b09c966a53414de6d55aaa8b5c5785033223b"} Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.547302 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e0cdd903-a3a8-4634-8c76-d8702ecce12c","Type":"ContainerDied","Data":"7e494cad81fe4b5babc364b23b49241bd0216c3348b868bc2015f0a07b199594"} Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.547312 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e0cdd903-a3a8-4634-8c76-d8702ecce12c","Type":"ContainerDied","Data":"ef5dbcadaffe879ff34dafd6584541118d5150d9784f7683683e4bb8fdffb5a6"} Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.547329 4599 scope.go:117] "RemoveContainer" containerID="ae18e3632fde1f6b4c5e8e3d561b09c966a53414de6d55aaa8b5c5785033223b" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.573619 4599 scope.go:117] "RemoveContainer" containerID="7e494cad81fe4b5babc364b23b49241bd0216c3348b868bc2015f0a07b199594" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.575376 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.582374 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.601029 4599 scope.go:117] "RemoveContainer" containerID="ae18e3632fde1f6b4c5e8e3d561b09c966a53414de6d55aaa8b5c5785033223b" Oct 12 07:51:16 crc kubenswrapper[4599]: E1012 07:51:16.602469 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ae18e3632fde1f6b4c5e8e3d561b09c966a53414de6d55aaa8b5c5785033223b\": container with ID starting with ae18e3632fde1f6b4c5e8e3d561b09c966a53414de6d55aaa8b5c5785033223b not found: ID does not exist" containerID="ae18e3632fde1f6b4c5e8e3d561b09c966a53414de6d55aaa8b5c5785033223b" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.602512 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae18e3632fde1f6b4c5e8e3d561b09c966a53414de6d55aaa8b5c5785033223b"} err="failed to get container status \"ae18e3632fde1f6b4c5e8e3d561b09c966a53414de6d55aaa8b5c5785033223b\": rpc error: code = NotFound desc = could not find container \"ae18e3632fde1f6b4c5e8e3d561b09c966a53414de6d55aaa8b5c5785033223b\": container with ID starting with ae18e3632fde1f6b4c5e8e3d561b09c966a53414de6d55aaa8b5c5785033223b not found: ID does not exist" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.602543 4599 scope.go:117] "RemoveContainer" containerID="7e494cad81fe4b5babc364b23b49241bd0216c3348b868bc2015f0a07b199594" Oct 12 07:51:16 crc kubenswrapper[4599]: E1012 07:51:16.605676 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e494cad81fe4b5babc364b23b49241bd0216c3348b868bc2015f0a07b199594\": container with ID starting with 7e494cad81fe4b5babc364b23b49241bd0216c3348b868bc2015f0a07b199594 not found: ID does not exist" containerID="7e494cad81fe4b5babc364b23b49241bd0216c3348b868bc2015f0a07b199594" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.605699 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e494cad81fe4b5babc364b23b49241bd0216c3348b868bc2015f0a07b199594"} err="failed to get container status \"7e494cad81fe4b5babc364b23b49241bd0216c3348b868bc2015f0a07b199594\": rpc error: code = NotFound desc = could not find container \"7e494cad81fe4b5babc364b23b49241bd0216c3348b868bc2015f0a07b199594\": container with ID 
starting with 7e494cad81fe4b5babc364b23b49241bd0216c3348b868bc2015f0a07b199594 not found: ID does not exist" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.605714 4599 scope.go:117] "RemoveContainer" containerID="ae18e3632fde1f6b4c5e8e3d561b09c966a53414de6d55aaa8b5c5785033223b" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.605767 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.606773 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae18e3632fde1f6b4c5e8e3d561b09c966a53414de6d55aaa8b5c5785033223b"} err="failed to get container status \"ae18e3632fde1f6b4c5e8e3d561b09c966a53414de6d55aaa8b5c5785033223b\": rpc error: code = NotFound desc = could not find container \"ae18e3632fde1f6b4c5e8e3d561b09c966a53414de6d55aaa8b5c5785033223b\": container with ID starting with ae18e3632fde1f6b4c5e8e3d561b09c966a53414de6d55aaa8b5c5785033223b not found: ID does not exist" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.606814 4599 scope.go:117] "RemoveContainer" containerID="7e494cad81fe4b5babc364b23b49241bd0216c3348b868bc2015f0a07b199594" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.607938 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e494cad81fe4b5babc364b23b49241bd0216c3348b868bc2015f0a07b199594"} err="failed to get container status \"7e494cad81fe4b5babc364b23b49241bd0216c3348b868bc2015f0a07b199594\": rpc error: code = NotFound desc = could not find container \"7e494cad81fe4b5babc364b23b49241bd0216c3348b868bc2015f0a07b199594\": container with ID starting with 7e494cad81fe4b5babc364b23b49241bd0216c3348b868bc2015f0a07b199594 not found: ID does not exist" Oct 12 07:51:16 crc kubenswrapper[4599]: E1012 07:51:16.609589 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0cdd903-a3a8-4634-8c76-d8702ecce12c" 
containerName="nova-metadata-log" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.609613 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0cdd903-a3a8-4634-8c76-d8702ecce12c" containerName="nova-metadata-log" Oct 12 07:51:16 crc kubenswrapper[4599]: E1012 07:51:16.609629 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0cdd903-a3a8-4634-8c76-d8702ecce12c" containerName="nova-metadata-metadata" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.609635 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0cdd903-a3a8-4634-8c76-d8702ecce12c" containerName="nova-metadata-metadata" Oct 12 07:51:16 crc kubenswrapper[4599]: E1012 07:51:16.609648 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebd046c8-0af0-4d96-99a4-ce5bb0dec770" containerName="init" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.609653 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd046c8-0af0-4d96-99a4-ce5bb0dec770" containerName="init" Oct 12 07:51:16 crc kubenswrapper[4599]: E1012 07:51:16.609672 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8" containerName="nova-manage" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.609677 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8" containerName="nova-manage" Oct 12 07:51:16 crc kubenswrapper[4599]: E1012 07:51:16.609688 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebd046c8-0af0-4d96-99a4-ce5bb0dec770" containerName="dnsmasq-dns" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.609694 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd046c8-0af0-4d96-99a4-ce5bb0dec770" containerName="dnsmasq-dns" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.609915 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0cdd903-a3a8-4634-8c76-d8702ecce12c" 
containerName="nova-metadata-metadata" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.609930 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebd046c8-0af0-4d96-99a4-ce5bb0dec770" containerName="dnsmasq-dns" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.609938 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8" containerName="nova-manage" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.609945 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0cdd903-a3a8-4634-8c76-d8702ecce12c" containerName="nova-metadata-log" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.610930 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.614989 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.615465 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.616274 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.655831 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44ffa825-906e-4f85-9da7-bf4f38d28e59-logs\") pod \"nova-metadata-0\" (UID: \"44ffa825-906e-4f85-9da7-bf4f38d28e59\") " pod="openstack/nova-metadata-0" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.656148 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ffa825-906e-4f85-9da7-bf4f38d28e59-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"44ffa825-906e-4f85-9da7-bf4f38d28e59\") " pod="openstack/nova-metadata-0" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.656202 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtxbh\" (UniqueName: \"kubernetes.io/projected/44ffa825-906e-4f85-9da7-bf4f38d28e59-kube-api-access-vtxbh\") pod \"nova-metadata-0\" (UID: \"44ffa825-906e-4f85-9da7-bf4f38d28e59\") " pod="openstack/nova-metadata-0" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.656459 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ffa825-906e-4f85-9da7-bf4f38d28e59-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"44ffa825-906e-4f85-9da7-bf4f38d28e59\") " pod="openstack/nova-metadata-0" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.656577 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44ffa825-906e-4f85-9da7-bf4f38d28e59-config-data\") pod \"nova-metadata-0\" (UID: \"44ffa825-906e-4f85-9da7-bf4f38d28e59\") " pod="openstack/nova-metadata-0" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.716028 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.759206 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44ffa825-906e-4f85-9da7-bf4f38d28e59-logs\") pod \"nova-metadata-0\" (UID: \"44ffa825-906e-4f85-9da7-bf4f38d28e59\") " pod="openstack/nova-metadata-0" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.759279 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/44ffa825-906e-4f85-9da7-bf4f38d28e59-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"44ffa825-906e-4f85-9da7-bf4f38d28e59\") " pod="openstack/nova-metadata-0" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.759314 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtxbh\" (UniqueName: \"kubernetes.io/projected/44ffa825-906e-4f85-9da7-bf4f38d28e59-kube-api-access-vtxbh\") pod \"nova-metadata-0\" (UID: \"44ffa825-906e-4f85-9da7-bf4f38d28e59\") " pod="openstack/nova-metadata-0" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.759390 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ffa825-906e-4f85-9da7-bf4f38d28e59-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"44ffa825-906e-4f85-9da7-bf4f38d28e59\") " pod="openstack/nova-metadata-0" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.759417 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44ffa825-906e-4f85-9da7-bf4f38d28e59-config-data\") pod \"nova-metadata-0\" (UID: \"44ffa825-906e-4f85-9da7-bf4f38d28e59\") " pod="openstack/nova-metadata-0" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.760996 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44ffa825-906e-4f85-9da7-bf4f38d28e59-logs\") pod \"nova-metadata-0\" (UID: \"44ffa825-906e-4f85-9da7-bf4f38d28e59\") " pod="openstack/nova-metadata-0" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.768213 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44ffa825-906e-4f85-9da7-bf4f38d28e59-config-data\") pod \"nova-metadata-0\" (UID: \"44ffa825-906e-4f85-9da7-bf4f38d28e59\") " pod="openstack/nova-metadata-0" Oct 12 07:51:16 crc 
kubenswrapper[4599]: I1012 07:51:16.769505 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ffa825-906e-4f85-9da7-bf4f38d28e59-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"44ffa825-906e-4f85-9da7-bf4f38d28e59\") " pod="openstack/nova-metadata-0" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.769771 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ffa825-906e-4f85-9da7-bf4f38d28e59-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"44ffa825-906e-4f85-9da7-bf4f38d28e59\") " pod="openstack/nova-metadata-0" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.777080 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtxbh\" (UniqueName: \"kubernetes.io/projected/44ffa825-906e-4f85-9da7-bf4f38d28e59-kube-api-access-vtxbh\") pod \"nova-metadata-0\" (UID: \"44ffa825-906e-4f85-9da7-bf4f38d28e59\") " pod="openstack/nova-metadata-0" Oct 12 07:51:16 crc kubenswrapper[4599]: I1012 07:51:16.948073 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 07:51:17 crc kubenswrapper[4599]: I1012 07:51:17.357394 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 07:51:17 crc kubenswrapper[4599]: I1012 07:51:17.562451 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0cdd903-a3a8-4634-8c76-d8702ecce12c" path="/var/lib/kubelet/pods/e0cdd903-a3a8-4634-8c76-d8702ecce12c/volumes" Oct 12 07:51:17 crc kubenswrapper[4599]: I1012 07:51:17.563856 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebd046c8-0af0-4d96-99a4-ce5bb0dec770" path="/var/lib/kubelet/pods/ebd046c8-0af0-4d96-99a4-ce5bb0dec770/volumes" Oct 12 07:51:17 crc kubenswrapper[4599]: I1012 07:51:17.566203 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"44ffa825-906e-4f85-9da7-bf4f38d28e59","Type":"ContainerStarted","Data":"fd084a8cf01892565881cff8cde7533a774ae578612d822ce20bea0fe2efc930"} Oct 12 07:51:17 crc kubenswrapper[4599]: I1012 07:51:17.566246 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"44ffa825-906e-4f85-9da7-bf4f38d28e59","Type":"ContainerStarted","Data":"8a7e9532a9c9b00010e5b48ce7160f5806c5df784b198f9cc7e1cc9812c6762c"} Oct 12 07:51:17 crc kubenswrapper[4599]: I1012 07:51:17.567594 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="04538b3c-68d8-43fa-adc3-d319f98bf0d9" containerName="nova-scheduler-scheduler" containerID="cri-o://73f4c8b92bd06f1e1fdc40db556350adf07c9fcecf7d9a766f30e3b507692ed8" gracePeriod=30 Oct 12 07:51:18 crc kubenswrapper[4599]: I1012 07:51:18.579990 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"44ffa825-906e-4f85-9da7-bf4f38d28e59","Type":"ContainerStarted","Data":"65fb22f48efa762e822f7330fbe6757c38f2eae38bc94dbe46fc002baf4228cd"} Oct 12 07:51:18 crc 
kubenswrapper[4599]: I1012 07:51:18.612460 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.612434358 podStartE2EDuration="2.612434358s" podCreationTimestamp="2025-10-12 07:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:51:18.604051361 +0000 UTC m=+975.393246863" watchObservedRunningTime="2025-10-12 07:51:18.612434358 +0000 UTC m=+975.401629860" Oct 12 07:51:19 crc kubenswrapper[4599]: E1012 07:51:19.541801 4599 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="73f4c8b92bd06f1e1fdc40db556350adf07c9fcecf7d9a766f30e3b507692ed8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 12 07:51:19 crc kubenswrapper[4599]: E1012 07:51:19.544129 4599 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="73f4c8b92bd06f1e1fdc40db556350adf07c9fcecf7d9a766f30e3b507692ed8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 12 07:51:19 crc kubenswrapper[4599]: E1012 07:51:19.545708 4599 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="73f4c8b92bd06f1e1fdc40db556350adf07c9fcecf7d9a766f30e3b507692ed8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 12 07:51:19 crc kubenswrapper[4599]: E1012 07:51:19.545849 4599 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/nova-scheduler-0" podUID="04538b3c-68d8-43fa-adc3-d319f98bf0d9" containerName="nova-scheduler-scheduler" Oct 12 07:51:20 crc kubenswrapper[4599]: I1012 07:51:20.604621 4599 generic.go:334] "Generic (PLEG): container finished" podID="04538b3c-68d8-43fa-adc3-d319f98bf0d9" containerID="73f4c8b92bd06f1e1fdc40db556350adf07c9fcecf7d9a766f30e3b507692ed8" exitCode=0 Oct 12 07:51:20 crc kubenswrapper[4599]: I1012 07:51:20.604721 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04538b3c-68d8-43fa-adc3-d319f98bf0d9","Type":"ContainerDied","Data":"73f4c8b92bd06f1e1fdc40db556350adf07c9fcecf7d9a766f30e3b507692ed8"} Oct 12 07:51:20 crc kubenswrapper[4599]: I1012 07:51:20.786050 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 07:51:20 crc kubenswrapper[4599]: I1012 07:51:20.843291 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04538b3c-68d8-43fa-adc3-d319f98bf0d9-config-data\") pod \"04538b3c-68d8-43fa-adc3-d319f98bf0d9\" (UID: \"04538b3c-68d8-43fa-adc3-d319f98bf0d9\") " Oct 12 07:51:20 crc kubenswrapper[4599]: I1012 07:51:20.869756 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04538b3c-68d8-43fa-adc3-d319f98bf0d9-config-data" (OuterVolumeSpecName: "config-data") pod "04538b3c-68d8-43fa-adc3-d319f98bf0d9" (UID: "04538b3c-68d8-43fa-adc3-d319f98bf0d9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:20 crc kubenswrapper[4599]: I1012 07:51:20.945952 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04538b3c-68d8-43fa-adc3-d319f98bf0d9-combined-ca-bundle\") pod \"04538b3c-68d8-43fa-adc3-d319f98bf0d9\" (UID: \"04538b3c-68d8-43fa-adc3-d319f98bf0d9\") " Oct 12 07:51:20 crc kubenswrapper[4599]: I1012 07:51:20.946538 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2c68\" (UniqueName: \"kubernetes.io/projected/04538b3c-68d8-43fa-adc3-d319f98bf0d9-kube-api-access-x2c68\") pod \"04538b3c-68d8-43fa-adc3-d319f98bf0d9\" (UID: \"04538b3c-68d8-43fa-adc3-d319f98bf0d9\") " Oct 12 07:51:20 crc kubenswrapper[4599]: I1012 07:51:20.947786 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04538b3c-68d8-43fa-adc3-d319f98bf0d9-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:20 crc kubenswrapper[4599]: I1012 07:51:20.950488 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04538b3c-68d8-43fa-adc3-d319f98bf0d9-kube-api-access-x2c68" (OuterVolumeSpecName: "kube-api-access-x2c68") pod "04538b3c-68d8-43fa-adc3-d319f98bf0d9" (UID: "04538b3c-68d8-43fa-adc3-d319f98bf0d9"). InnerVolumeSpecName "kube-api-access-x2c68". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:51:20 crc kubenswrapper[4599]: I1012 07:51:20.972293 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04538b3c-68d8-43fa-adc3-d319f98bf0d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04538b3c-68d8-43fa-adc3-d319f98bf0d9" (UID: "04538b3c-68d8-43fa-adc3-d319f98bf0d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.050114 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2c68\" (UniqueName: \"kubernetes.io/projected/04538b3c-68d8-43fa-adc3-d319f98bf0d9-kube-api-access-x2c68\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.050170 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04538b3c-68d8-43fa-adc3-d319f98bf0d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.289286 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.460963 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b8ae9f7-4558-4445-8976-db978922edde-logs\") pod \"3b8ae9f7-4558-4445-8976-db978922edde\" (UID: \"3b8ae9f7-4558-4445-8976-db978922edde\") " Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.461028 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cflm\" (UniqueName: \"kubernetes.io/projected/3b8ae9f7-4558-4445-8976-db978922edde-kube-api-access-5cflm\") pod \"3b8ae9f7-4558-4445-8976-db978922edde\" (UID: \"3b8ae9f7-4558-4445-8976-db978922edde\") " Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.461065 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8ae9f7-4558-4445-8976-db978922edde-combined-ca-bundle\") pod \"3b8ae9f7-4558-4445-8976-db978922edde\" (UID: \"3b8ae9f7-4558-4445-8976-db978922edde\") " Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.461373 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/3b8ae9f7-4558-4445-8976-db978922edde-config-data\") pod \"3b8ae9f7-4558-4445-8976-db978922edde\" (UID: \"3b8ae9f7-4558-4445-8976-db978922edde\") " Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.461698 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b8ae9f7-4558-4445-8976-db978922edde-logs" (OuterVolumeSpecName: "logs") pod "3b8ae9f7-4558-4445-8976-db978922edde" (UID: "3b8ae9f7-4558-4445-8976-db978922edde"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.461959 4599 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b8ae9f7-4558-4445-8976-db978922edde-logs\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.464308 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b8ae9f7-4558-4445-8976-db978922edde-kube-api-access-5cflm" (OuterVolumeSpecName: "kube-api-access-5cflm") pod "3b8ae9f7-4558-4445-8976-db978922edde" (UID: "3b8ae9f7-4558-4445-8976-db978922edde"). InnerVolumeSpecName "kube-api-access-5cflm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.481591 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b8ae9f7-4558-4445-8976-db978922edde-config-data" (OuterVolumeSpecName: "config-data") pod "3b8ae9f7-4558-4445-8976-db978922edde" (UID: "3b8ae9f7-4558-4445-8976-db978922edde"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.482399 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b8ae9f7-4558-4445-8976-db978922edde-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b8ae9f7-4558-4445-8976-db978922edde" (UID: "3b8ae9f7-4558-4445-8976-db978922edde"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.567051 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8ae9f7-4558-4445-8976-db978922edde-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.567366 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cflm\" (UniqueName: \"kubernetes.io/projected/3b8ae9f7-4558-4445-8976-db978922edde-kube-api-access-5cflm\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.567378 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8ae9f7-4558-4445-8976-db978922edde-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.616026 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.616053 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04538b3c-68d8-43fa-adc3-d319f98bf0d9","Type":"ContainerDied","Data":"4eda18998bd5accb28f0bbafa6a8b457511b836b15b0d252ddc4a14eaebc79cd"} Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.616106 4599 scope.go:117] "RemoveContainer" containerID="73f4c8b92bd06f1e1fdc40db556350adf07c9fcecf7d9a766f30e3b507692ed8" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.621138 4599 generic.go:334] "Generic (PLEG): container finished" podID="3b8ae9f7-4558-4445-8976-db978922edde" containerID="1674c42961443f7863d2efc66469ce66109d749bd6a3f1ddbb3de56ae242fd08" exitCode=0 Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.621174 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b8ae9f7-4558-4445-8976-db978922edde","Type":"ContainerDied","Data":"1674c42961443f7863d2efc66469ce66109d749bd6a3f1ddbb3de56ae242fd08"} Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.621189 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b8ae9f7-4558-4445-8976-db978922edde","Type":"ContainerDied","Data":"01b7b9b04198b80b2a036e9d8886f5bb775f5ca51d6dd289dd2148954dd65a08"} Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.621216 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.636367 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.647542 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.660541 4599 scope.go:117] "RemoveContainer" containerID="1674c42961443f7863d2efc66469ce66109d749bd6a3f1ddbb3de56ae242fd08" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.679748 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.682080 4599 scope.go:117] "RemoveContainer" containerID="8aa833c06a2a9e762622ad8c932f1593ea522e274c9387755ea12c13264466ad" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.690256 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.702110 4599 scope.go:117] "RemoveContainer" containerID="1674c42961443f7863d2efc66469ce66109d749bd6a3f1ddbb3de56ae242fd08" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.707658 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 07:51:21 crc kubenswrapper[4599]: E1012 07:51:21.709723 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1674c42961443f7863d2efc66469ce66109d749bd6a3f1ddbb3de56ae242fd08\": container with ID starting with 1674c42961443f7863d2efc66469ce66109d749bd6a3f1ddbb3de56ae242fd08 not found: ID does not exist" containerID="1674c42961443f7863d2efc66469ce66109d749bd6a3f1ddbb3de56ae242fd08" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.709788 4599 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1674c42961443f7863d2efc66469ce66109d749bd6a3f1ddbb3de56ae242fd08"} err="failed to get container status \"1674c42961443f7863d2efc66469ce66109d749bd6a3f1ddbb3de56ae242fd08\": rpc error: code = NotFound desc = could not find container \"1674c42961443f7863d2efc66469ce66109d749bd6a3f1ddbb3de56ae242fd08\": container with ID starting with 1674c42961443f7863d2efc66469ce66109d749bd6a3f1ddbb3de56ae242fd08 not found: ID does not exist" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.709814 4599 scope.go:117] "RemoveContainer" containerID="8aa833c06a2a9e762622ad8c932f1593ea522e274c9387755ea12c13264466ad" Oct 12 07:51:21 crc kubenswrapper[4599]: E1012 07:51:21.711300 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aa833c06a2a9e762622ad8c932f1593ea522e274c9387755ea12c13264466ad\": container with ID starting with 8aa833c06a2a9e762622ad8c932f1593ea522e274c9387755ea12c13264466ad not found: ID does not exist" containerID="8aa833c06a2a9e762622ad8c932f1593ea522e274c9387755ea12c13264466ad" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.711386 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aa833c06a2a9e762622ad8c932f1593ea522e274c9387755ea12c13264466ad"} err="failed to get container status \"8aa833c06a2a9e762622ad8c932f1593ea522e274c9387755ea12c13264466ad\": rpc error: code = NotFound desc = could not find container \"8aa833c06a2a9e762622ad8c932f1593ea522e274c9387755ea12c13264466ad\": container with ID starting with 8aa833c06a2a9e762622ad8c932f1593ea522e274c9387755ea12c13264466ad not found: ID does not exist" Oct 12 07:51:21 crc kubenswrapper[4599]: E1012 07:51:21.711655 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04538b3c-68d8-43fa-adc3-d319f98bf0d9" containerName="nova-scheduler-scheduler" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.711713 4599 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="04538b3c-68d8-43fa-adc3-d319f98bf0d9" containerName="nova-scheduler-scheduler" Oct 12 07:51:21 crc kubenswrapper[4599]: E1012 07:51:21.711732 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b8ae9f7-4558-4445-8976-db978922edde" containerName="nova-api-log" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.711741 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b8ae9f7-4558-4445-8976-db978922edde" containerName="nova-api-log" Oct 12 07:51:21 crc kubenswrapper[4599]: E1012 07:51:21.711803 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b8ae9f7-4558-4445-8976-db978922edde" containerName="nova-api-api" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.711813 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b8ae9f7-4558-4445-8976-db978922edde" containerName="nova-api-api" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.712711 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b8ae9f7-4558-4445-8976-db978922edde" containerName="nova-api-api" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.712762 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b8ae9f7-4558-4445-8976-db978922edde" containerName="nova-api-log" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.712783 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="04538b3c-68d8-43fa-adc3-d319f98bf0d9" containerName="nova-scheduler-scheduler" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.716989 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.720574 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.732641 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.743711 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.745866 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.748100 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.761205 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.872987 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vqbz\" (UniqueName: \"kubernetes.io/projected/9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0-kube-api-access-8vqbz\") pod \"nova-scheduler-0\" (UID: \"9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0\") " pod="openstack/nova-scheduler-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.873072 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0\") " pod="openstack/nova-scheduler-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.873156 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkmlf\" (UniqueName: 
\"kubernetes.io/projected/341bb186-49ec-48b2-85a0-05c66a619132-kube-api-access-xkmlf\") pod \"nova-api-0\" (UID: \"341bb186-49ec-48b2-85a0-05c66a619132\") " pod="openstack/nova-api-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.873217 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0-config-data\") pod \"nova-scheduler-0\" (UID: \"9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0\") " pod="openstack/nova-scheduler-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.873319 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/341bb186-49ec-48b2-85a0-05c66a619132-logs\") pod \"nova-api-0\" (UID: \"341bb186-49ec-48b2-85a0-05c66a619132\") " pod="openstack/nova-api-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.873361 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/341bb186-49ec-48b2-85a0-05c66a619132-config-data\") pod \"nova-api-0\" (UID: \"341bb186-49ec-48b2-85a0-05c66a619132\") " pod="openstack/nova-api-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.873395 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341bb186-49ec-48b2-85a0-05c66a619132-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"341bb186-49ec-48b2-85a0-05c66a619132\") " pod="openstack/nova-api-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.948730 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.948792 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 12 07:51:21 crc 
kubenswrapper[4599]: I1012 07:51:21.974689 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkmlf\" (UniqueName: \"kubernetes.io/projected/341bb186-49ec-48b2-85a0-05c66a619132-kube-api-access-xkmlf\") pod \"nova-api-0\" (UID: \"341bb186-49ec-48b2-85a0-05c66a619132\") " pod="openstack/nova-api-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.974756 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0-config-data\") pod \"nova-scheduler-0\" (UID: \"9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0\") " pod="openstack/nova-scheduler-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.974809 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/341bb186-49ec-48b2-85a0-05c66a619132-logs\") pod \"nova-api-0\" (UID: \"341bb186-49ec-48b2-85a0-05c66a619132\") " pod="openstack/nova-api-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.974829 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/341bb186-49ec-48b2-85a0-05c66a619132-config-data\") pod \"nova-api-0\" (UID: \"341bb186-49ec-48b2-85a0-05c66a619132\") " pod="openstack/nova-api-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.974854 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341bb186-49ec-48b2-85a0-05c66a619132-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"341bb186-49ec-48b2-85a0-05c66a619132\") " pod="openstack/nova-api-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.974884 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vqbz\" (UniqueName: 
\"kubernetes.io/projected/9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0-kube-api-access-8vqbz\") pod \"nova-scheduler-0\" (UID: \"9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0\") " pod="openstack/nova-scheduler-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.974914 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0\") " pod="openstack/nova-scheduler-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.975515 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/341bb186-49ec-48b2-85a0-05c66a619132-logs\") pod \"nova-api-0\" (UID: \"341bb186-49ec-48b2-85a0-05c66a619132\") " pod="openstack/nova-api-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.980532 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0\") " pod="openstack/nova-scheduler-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.981487 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0-config-data\") pod \"nova-scheduler-0\" (UID: \"9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0\") " pod="openstack/nova-scheduler-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.982327 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/341bb186-49ec-48b2-85a0-05c66a619132-config-data\") pod \"nova-api-0\" (UID: \"341bb186-49ec-48b2-85a0-05c66a619132\") " pod="openstack/nova-api-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.984026 
4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341bb186-49ec-48b2-85a0-05c66a619132-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"341bb186-49ec-48b2-85a0-05c66a619132\") " pod="openstack/nova-api-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.991899 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkmlf\" (UniqueName: \"kubernetes.io/projected/341bb186-49ec-48b2-85a0-05c66a619132-kube-api-access-xkmlf\") pod \"nova-api-0\" (UID: \"341bb186-49ec-48b2-85a0-05c66a619132\") " pod="openstack/nova-api-0" Oct 12 07:51:21 crc kubenswrapper[4599]: I1012 07:51:21.992395 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vqbz\" (UniqueName: \"kubernetes.io/projected/9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0-kube-api-access-8vqbz\") pod \"nova-scheduler-0\" (UID: \"9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0\") " pod="openstack/nova-scheduler-0" Oct 12 07:51:22 crc kubenswrapper[4599]: I1012 07:51:22.034652 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 07:51:22 crc kubenswrapper[4599]: I1012 07:51:22.071765 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 12 07:51:22 crc kubenswrapper[4599]: I1012 07:51:22.451376 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 07:51:22 crc kubenswrapper[4599]: I1012 07:51:22.531528 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 07:51:22 crc kubenswrapper[4599]: W1012 07:51:22.531664 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod341bb186_49ec_48b2_85a0_05c66a619132.slice/crio-75017b9472da4d7f7b42264459bf59f1ae1c80e9f31fa65ee8e9d7aec1326fae WatchSource:0}: Error finding container 75017b9472da4d7f7b42264459bf59f1ae1c80e9f31fa65ee8e9d7aec1326fae: Status 404 returned error can't find the container with id 75017b9472da4d7f7b42264459bf59f1ae1c80e9f31fa65ee8e9d7aec1326fae Oct 12 07:51:22 crc kubenswrapper[4599]: I1012 07:51:22.640426 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0","Type":"ContainerStarted","Data":"18b15edd1792c5b1d06a3bf2ac5b445e627efcee2e31bda0225826675b0cb81f"} Oct 12 07:51:22 crc kubenswrapper[4599]: I1012 07:51:22.643551 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"341bb186-49ec-48b2-85a0-05c66a619132","Type":"ContainerStarted","Data":"75017b9472da4d7f7b42264459bf59f1ae1c80e9f31fa65ee8e9d7aec1326fae"} Oct 12 07:51:23 crc kubenswrapper[4599]: I1012 07:51:23.564162 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04538b3c-68d8-43fa-adc3-d319f98bf0d9" path="/var/lib/kubelet/pods/04538b3c-68d8-43fa-adc3-d319f98bf0d9/volumes" Oct 12 07:51:23 crc kubenswrapper[4599]: I1012 07:51:23.565168 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b8ae9f7-4558-4445-8976-db978922edde" path="/var/lib/kubelet/pods/3b8ae9f7-4558-4445-8976-db978922edde/volumes" Oct 12 
07:51:23 crc kubenswrapper[4599]: I1012 07:51:23.652010 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0","Type":"ContainerStarted","Data":"045f9179b498ac00a5359fba8cf9dd1337a8e9970cd9e345df9d4d0b0841076b"} Oct 12 07:51:23 crc kubenswrapper[4599]: I1012 07:51:23.655928 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"341bb186-49ec-48b2-85a0-05c66a619132","Type":"ContainerStarted","Data":"ddd8c04d279621d909b9a924d1a1d79dfdc5f9389691bd56dc312732c9a7aab2"} Oct 12 07:51:23 crc kubenswrapper[4599]: I1012 07:51:23.655954 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"341bb186-49ec-48b2-85a0-05c66a619132","Type":"ContainerStarted","Data":"0c14022bfa9db06149f05d04d312fa7b9ab14179e6e0d10545f40e2809cd3368"} Oct 12 07:51:23 crc kubenswrapper[4599]: I1012 07:51:23.690668 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.69064809 podStartE2EDuration="2.69064809s" podCreationTimestamp="2025-10-12 07:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:51:23.675708402 +0000 UTC m=+980.464903904" watchObservedRunningTime="2025-10-12 07:51:23.69064809 +0000 UTC m=+980.479843592" Oct 12 07:51:23 crc kubenswrapper[4599]: I1012 07:51:23.697023 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.697014813 podStartE2EDuration="2.697014813s" podCreationTimestamp="2025-10-12 07:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:51:23.687448714 +0000 UTC m=+980.476644216" watchObservedRunningTime="2025-10-12 07:51:23.697014813 +0000 UTC m=+980.486210316" Oct 12 
07:51:23 crc kubenswrapper[4599]: I1012 07:51:23.879985 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 12 07:51:26 crc kubenswrapper[4599]: I1012 07:51:26.948351 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 12 07:51:26 crc kubenswrapper[4599]: I1012 07:51:26.948833 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 12 07:51:27 crc kubenswrapper[4599]: I1012 07:51:27.035409 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 12 07:51:27 crc kubenswrapper[4599]: I1012 07:51:27.966478 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="44ffa825-906e-4f85-9da7-bf4f38d28e59" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 12 07:51:27 crc kubenswrapper[4599]: I1012 07:51:27.966502 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="44ffa825-906e-4f85-9da7-bf4f38d28e59" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 12 07:51:32 crc kubenswrapper[4599]: I1012 07:51:32.034849 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 12 07:51:32 crc kubenswrapper[4599]: I1012 07:51:32.056696 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 12 07:51:32 crc kubenswrapper[4599]: I1012 07:51:32.072080 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 07:51:32 crc kubenswrapper[4599]: I1012 07:51:32.072146 
4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 07:51:32 crc kubenswrapper[4599]: I1012 07:51:32.770396 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 12 07:51:33 crc kubenswrapper[4599]: I1012 07:51:33.154505 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="341bb186-49ec-48b2-85a0-05c66a619132" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 07:51:33 crc kubenswrapper[4599]: I1012 07:51:33.154568 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="341bb186-49ec-48b2-85a0-05c66a619132" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 07:51:36 crc kubenswrapper[4599]: I1012 07:51:36.955897 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 12 07:51:36 crc kubenswrapper[4599]: I1012 07:51:36.956443 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 12 07:51:36 crc kubenswrapper[4599]: I1012 07:51:36.964212 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 12 07:51:36 crc kubenswrapper[4599]: I1012 07:51:36.970085 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 12 07:51:39 crc kubenswrapper[4599]: I1012 07:51:39.736616 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:39 crc kubenswrapper[4599]: I1012 07:51:39.807268 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j89bz\" (UniqueName: \"kubernetes.io/projected/2e47371c-f8d3-4eef-9a33-f4f6f7169dee-kube-api-access-j89bz\") pod \"2e47371c-f8d3-4eef-9a33-f4f6f7169dee\" (UID: \"2e47371c-f8d3-4eef-9a33-f4f6f7169dee\") " Oct 12 07:51:39 crc kubenswrapper[4599]: I1012 07:51:39.807771 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e47371c-f8d3-4eef-9a33-f4f6f7169dee-config-data\") pod \"2e47371c-f8d3-4eef-9a33-f4f6f7169dee\" (UID: \"2e47371c-f8d3-4eef-9a33-f4f6f7169dee\") " Oct 12 07:51:39 crc kubenswrapper[4599]: I1012 07:51:39.807905 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e47371c-f8d3-4eef-9a33-f4f6f7169dee-combined-ca-bundle\") pod \"2e47371c-f8d3-4eef-9a33-f4f6f7169dee\" (UID: \"2e47371c-f8d3-4eef-9a33-f4f6f7169dee\") " Oct 12 07:51:39 crc kubenswrapper[4599]: I1012 07:51:39.812891 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e47371c-f8d3-4eef-9a33-f4f6f7169dee-kube-api-access-j89bz" (OuterVolumeSpecName: "kube-api-access-j89bz") pod "2e47371c-f8d3-4eef-9a33-f4f6f7169dee" (UID: "2e47371c-f8d3-4eef-9a33-f4f6f7169dee"). InnerVolumeSpecName "kube-api-access-j89bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:51:39 crc kubenswrapper[4599]: I1012 07:51:39.816875 4599 generic.go:334] "Generic (PLEG): container finished" podID="2e47371c-f8d3-4eef-9a33-f4f6f7169dee" containerID="f75f2f7840736ad38279f593d358c36d383a46619d646761943c18436cfc04f4" exitCode=137 Oct 12 07:51:39 crc kubenswrapper[4599]: I1012 07:51:39.816956 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:39 crc kubenswrapper[4599]: I1012 07:51:39.816958 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2e47371c-f8d3-4eef-9a33-f4f6f7169dee","Type":"ContainerDied","Data":"f75f2f7840736ad38279f593d358c36d383a46619d646761943c18436cfc04f4"} Oct 12 07:51:39 crc kubenswrapper[4599]: I1012 07:51:39.817371 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2e47371c-f8d3-4eef-9a33-f4f6f7169dee","Type":"ContainerDied","Data":"df31d022720b454b4781a46c2f4b6d6755901b117df48c6dd5de8c4f97e43075"} Oct 12 07:51:39 crc kubenswrapper[4599]: I1012 07:51:39.817396 4599 scope.go:117] "RemoveContainer" containerID="f75f2f7840736ad38279f593d358c36d383a46619d646761943c18436cfc04f4" Oct 12 07:51:39 crc kubenswrapper[4599]: I1012 07:51:39.830070 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e47371c-f8d3-4eef-9a33-f4f6f7169dee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e47371c-f8d3-4eef-9a33-f4f6f7169dee" (UID: "2e47371c-f8d3-4eef-9a33-f4f6f7169dee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:39 crc kubenswrapper[4599]: I1012 07:51:39.832765 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e47371c-f8d3-4eef-9a33-f4f6f7169dee-config-data" (OuterVolumeSpecName: "config-data") pod "2e47371c-f8d3-4eef-9a33-f4f6f7169dee" (UID: "2e47371c-f8d3-4eef-9a33-f4f6f7169dee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:39 crc kubenswrapper[4599]: I1012 07:51:39.888305 4599 scope.go:117] "RemoveContainer" containerID="f75f2f7840736ad38279f593d358c36d383a46619d646761943c18436cfc04f4" Oct 12 07:51:39 crc kubenswrapper[4599]: E1012 07:51:39.889731 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f75f2f7840736ad38279f593d358c36d383a46619d646761943c18436cfc04f4\": container with ID starting with f75f2f7840736ad38279f593d358c36d383a46619d646761943c18436cfc04f4 not found: ID does not exist" containerID="f75f2f7840736ad38279f593d358c36d383a46619d646761943c18436cfc04f4" Oct 12 07:51:39 crc kubenswrapper[4599]: I1012 07:51:39.889773 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f75f2f7840736ad38279f593d358c36d383a46619d646761943c18436cfc04f4"} err="failed to get container status \"f75f2f7840736ad38279f593d358c36d383a46619d646761943c18436cfc04f4\": rpc error: code = NotFound desc = could not find container \"f75f2f7840736ad38279f593d358c36d383a46619d646761943c18436cfc04f4\": container with ID starting with f75f2f7840736ad38279f593d358c36d383a46619d646761943c18436cfc04f4 not found: ID does not exist" Oct 12 07:51:39 crc kubenswrapper[4599]: I1012 07:51:39.910412 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j89bz\" (UniqueName: \"kubernetes.io/projected/2e47371c-f8d3-4eef-9a33-f4f6f7169dee-kube-api-access-j89bz\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:39 crc kubenswrapper[4599]: I1012 07:51:39.910445 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e47371c-f8d3-4eef-9a33-f4f6f7169dee-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:39 crc kubenswrapper[4599]: I1012 07:51:39.910459 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2e47371c-f8d3-4eef-9a33-f4f6f7169dee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.023025 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.145176 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.149972 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.161794 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 07:51:40 crc kubenswrapper[4599]: E1012 07:51:40.162209 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e47371c-f8d3-4eef-9a33-f4f6f7169dee" containerName="nova-cell1-novncproxy-novncproxy" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.162231 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e47371c-f8d3-4eef-9a33-f4f6f7169dee" containerName="nova-cell1-novncproxy-novncproxy" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.162447 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e47371c-f8d3-4eef-9a33-f4f6f7169dee" containerName="nova-cell1-novncproxy-novncproxy" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.164963 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.166865 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.167019 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.167114 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.179863 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.216645 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2abe9d8b-a086-4e4c-8873-3b50714935c9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2abe9d8b-a086-4e4c-8873-3b50714935c9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.216716 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2abe9d8b-a086-4e4c-8873-3b50714935c9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2abe9d8b-a086-4e4c-8873-3b50714935c9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.216756 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqkh7\" (UniqueName: \"kubernetes.io/projected/2abe9d8b-a086-4e4c-8873-3b50714935c9-kube-api-access-sqkh7\") pod \"nova-cell1-novncproxy-0\" (UID: \"2abe9d8b-a086-4e4c-8873-3b50714935c9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 
12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.216822 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2abe9d8b-a086-4e4c-8873-3b50714935c9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2abe9d8b-a086-4e4c-8873-3b50714935c9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.216886 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2abe9d8b-a086-4e4c-8873-3b50714935c9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2abe9d8b-a086-4e4c-8873-3b50714935c9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.319118 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2abe9d8b-a086-4e4c-8873-3b50714935c9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2abe9d8b-a086-4e4c-8873-3b50714935c9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.319187 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqkh7\" (UniqueName: \"kubernetes.io/projected/2abe9d8b-a086-4e4c-8873-3b50714935c9-kube-api-access-sqkh7\") pod \"nova-cell1-novncproxy-0\" (UID: \"2abe9d8b-a086-4e4c-8873-3b50714935c9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.319275 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2abe9d8b-a086-4e4c-8873-3b50714935c9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2abe9d8b-a086-4e4c-8873-3b50714935c9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 
07:51:40.319330 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2abe9d8b-a086-4e4c-8873-3b50714935c9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2abe9d8b-a086-4e4c-8873-3b50714935c9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.319423 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2abe9d8b-a086-4e4c-8873-3b50714935c9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2abe9d8b-a086-4e4c-8873-3b50714935c9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.323362 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2abe9d8b-a086-4e4c-8873-3b50714935c9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2abe9d8b-a086-4e4c-8873-3b50714935c9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.323700 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2abe9d8b-a086-4e4c-8873-3b50714935c9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2abe9d8b-a086-4e4c-8873-3b50714935c9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.324110 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2abe9d8b-a086-4e4c-8873-3b50714935c9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2abe9d8b-a086-4e4c-8873-3b50714935c9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.324596 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2abe9d8b-a086-4e4c-8873-3b50714935c9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2abe9d8b-a086-4e4c-8873-3b50714935c9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.335149 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqkh7\" (UniqueName: \"kubernetes.io/projected/2abe9d8b-a086-4e4c-8873-3b50714935c9-kube-api-access-sqkh7\") pod \"nova-cell1-novncproxy-0\" (UID: \"2abe9d8b-a086-4e4c-8873-3b50714935c9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.484115 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:40 crc kubenswrapper[4599]: I1012 07:51:40.889655 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 07:51:40 crc kubenswrapper[4599]: W1012 07:51:40.894380 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2abe9d8b_a086_4e4c_8873_3b50714935c9.slice/crio-92aba64d78d5ebfd8979901d42249fdb3f37a8b97230cfb7944e7e37a42172de WatchSource:0}: Error finding container 92aba64d78d5ebfd8979901d42249fdb3f37a8b97230cfb7944e7e37a42172de: Status 404 returned error can't find the container with id 92aba64d78d5ebfd8979901d42249fdb3f37a8b97230cfb7944e7e37a42172de Oct 12 07:51:41 crc kubenswrapper[4599]: I1012 07:51:41.557043 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e47371c-f8d3-4eef-9a33-f4f6f7169dee" path="/var/lib/kubelet/pods/2e47371c-f8d3-4eef-9a33-f4f6f7169dee/volumes" Oct 12 07:51:41 crc kubenswrapper[4599]: I1012 07:51:41.843198 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"2abe9d8b-a086-4e4c-8873-3b50714935c9","Type":"ContainerStarted","Data":"0c90c0cb540a96cd94a38c5b5df73d020b49e339c7c9dc542079526abbceb76d"} Oct 12 07:51:41 crc kubenswrapper[4599]: I1012 07:51:41.843275 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2abe9d8b-a086-4e4c-8873-3b50714935c9","Type":"ContainerStarted","Data":"92aba64d78d5ebfd8979901d42249fdb3f37a8b97230cfb7944e7e37a42172de"} Oct 12 07:51:41 crc kubenswrapper[4599]: I1012 07:51:41.861182 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.861170006 podStartE2EDuration="1.861170006s" podCreationTimestamp="2025-10-12 07:51:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:51:41.857940161 +0000 UTC m=+998.647135664" watchObservedRunningTime="2025-10-12 07:51:41.861170006 +0000 UTC m=+998.650365507" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.078782 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.078976 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.079561 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.079613 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.081490 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.083140 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" 
Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.266512 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57db76b469-s5l9q"] Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.275352 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.279367 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57db76b469-s5l9q"] Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.369851 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64zv7\" (UniqueName: \"kubernetes.io/projected/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-kube-api-access-64zv7\") pod \"dnsmasq-dns-57db76b469-s5l9q\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.369922 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-ovsdbserver-sb\") pod \"dnsmasq-dns-57db76b469-s5l9q\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.369962 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-dns-svc\") pod \"dnsmasq-dns-57db76b469-s5l9q\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.370261 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-dns-swift-storage-0\") pod \"dnsmasq-dns-57db76b469-s5l9q\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.370391 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-config\") pod \"dnsmasq-dns-57db76b469-s5l9q\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.370559 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-ovsdbserver-nb\") pod \"dnsmasq-dns-57db76b469-s5l9q\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.473405 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-dns-swift-storage-0\") pod \"dnsmasq-dns-57db76b469-s5l9q\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.473750 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-config\") pod \"dnsmasq-dns-57db76b469-s5l9q\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.473817 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-ovsdbserver-nb\") pod \"dnsmasq-dns-57db76b469-s5l9q\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.473851 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64zv7\" (UniqueName: \"kubernetes.io/projected/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-kube-api-access-64zv7\") pod \"dnsmasq-dns-57db76b469-s5l9q\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.473898 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-ovsdbserver-sb\") pod \"dnsmasq-dns-57db76b469-s5l9q\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.473920 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-dns-svc\") pod \"dnsmasq-dns-57db76b469-s5l9q\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.474192 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-dns-swift-storage-0\") pod \"dnsmasq-dns-57db76b469-s5l9q\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.474714 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-config\") pod \"dnsmasq-dns-57db76b469-s5l9q\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.474807 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-ovsdbserver-nb\") pod \"dnsmasq-dns-57db76b469-s5l9q\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.474935 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-ovsdbserver-sb\") pod \"dnsmasq-dns-57db76b469-s5l9q\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.474997 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-dns-svc\") pod \"dnsmasq-dns-57db76b469-s5l9q\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.490350 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64zv7\" (UniqueName: \"kubernetes.io/projected/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-kube-api-access-64zv7\") pod \"dnsmasq-dns-57db76b469-s5l9q\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:42 crc kubenswrapper[4599]: I1012 07:51:42.619478 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:43 crc kubenswrapper[4599]: I1012 07:51:43.035299 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57db76b469-s5l9q"] Oct 12 07:51:43 crc kubenswrapper[4599]: I1012 07:51:43.864668 4599 generic.go:334] "Generic (PLEG): container finished" podID="2f4ce98b-b927-4e97-8066-e8eac5fa0c5e" containerID="2464fe787b05c66b60ac6861e434544738b2c96cdc2b4245742a7a2fb93983fd" exitCode=0 Oct 12 07:51:43 crc kubenswrapper[4599]: I1012 07:51:43.864801 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db76b469-s5l9q" event={"ID":"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e","Type":"ContainerDied","Data":"2464fe787b05c66b60ac6861e434544738b2c96cdc2b4245742a7a2fb93983fd"} Oct 12 07:51:43 crc kubenswrapper[4599]: I1012 07:51:43.865120 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db76b469-s5l9q" event={"ID":"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e","Type":"ContainerStarted","Data":"c907c305df36461dc94f5a11064d1a23d5a79a8d9f3de80a8f529e4d6d810682"} Oct 12 07:51:44 crc kubenswrapper[4599]: I1012 07:51:44.395808 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 12 07:51:44 crc kubenswrapper[4599]: I1012 07:51:44.711148 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:51:44 crc kubenswrapper[4599]: I1012 07:51:44.711709 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f64dd0c-359b-4342-a814-923eb2a16de8" containerName="ceilometer-central-agent" containerID="cri-o://e165eb4006b0aef1fa3a4b1b73a4c8cd2fbea6091ee302ca98c3f51058dfdea5" gracePeriod=30 Oct 12 07:51:44 crc kubenswrapper[4599]: I1012 07:51:44.711784 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f64dd0c-359b-4342-a814-923eb2a16de8" 
containerName="ceilometer-notification-agent" containerID="cri-o://d2dedd30e3874783f4673920d521039ef9fdf4a60edf85cfeec76a0dba065727" gracePeriod=30 Oct 12 07:51:44 crc kubenswrapper[4599]: I1012 07:51:44.711792 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f64dd0c-359b-4342-a814-923eb2a16de8" containerName="sg-core" containerID="cri-o://b0ff498e8a7d91880aab3b49293ea7be8e807a97fa9b5cf175d3fd4c0293ae08" gracePeriod=30 Oct 12 07:51:44 crc kubenswrapper[4599]: I1012 07:51:44.711985 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f64dd0c-359b-4342-a814-923eb2a16de8" containerName="proxy-httpd" containerID="cri-o://604af05a6a54da5d9dec5916ada02a8812757e25950e35a59d9ad86da4c46901" gracePeriod=30 Oct 12 07:51:44 crc kubenswrapper[4599]: I1012 07:51:44.876689 4599 generic.go:334] "Generic (PLEG): container finished" podID="2f64dd0c-359b-4342-a814-923eb2a16de8" containerID="604af05a6a54da5d9dec5916ada02a8812757e25950e35a59d9ad86da4c46901" exitCode=0 Oct 12 07:51:44 crc kubenswrapper[4599]: I1012 07:51:44.876732 4599 generic.go:334] "Generic (PLEG): container finished" podID="2f64dd0c-359b-4342-a814-923eb2a16de8" containerID="b0ff498e8a7d91880aab3b49293ea7be8e807a97fa9b5cf175d3fd4c0293ae08" exitCode=2 Oct 12 07:51:44 crc kubenswrapper[4599]: I1012 07:51:44.876770 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f64dd0c-359b-4342-a814-923eb2a16de8","Type":"ContainerDied","Data":"604af05a6a54da5d9dec5916ada02a8812757e25950e35a59d9ad86da4c46901"} Oct 12 07:51:44 crc kubenswrapper[4599]: I1012 07:51:44.876808 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f64dd0c-359b-4342-a814-923eb2a16de8","Type":"ContainerDied","Data":"b0ff498e8a7d91880aab3b49293ea7be8e807a97fa9b5cf175d3fd4c0293ae08"} Oct 12 07:51:44 crc kubenswrapper[4599]: I1012 07:51:44.879803 
4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="341bb186-49ec-48b2-85a0-05c66a619132" containerName="nova-api-log" containerID="cri-o://0c14022bfa9db06149f05d04d312fa7b9ab14179e6e0d10545f40e2809cd3368" gracePeriod=30 Oct 12 07:51:44 crc kubenswrapper[4599]: I1012 07:51:44.880627 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db76b469-s5l9q" event={"ID":"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e","Type":"ContainerStarted","Data":"12b9fccc3bac8c37f063d62ea18186c7beb8665965052758fae59d0afa547fc5"} Oct 12 07:51:44 crc kubenswrapper[4599]: I1012 07:51:44.880655 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:44 crc kubenswrapper[4599]: I1012 07:51:44.880907 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="341bb186-49ec-48b2-85a0-05c66a619132" containerName="nova-api-api" containerID="cri-o://ddd8c04d279621d909b9a924d1a1d79dfdc5f9389691bd56dc312732c9a7aab2" gracePeriod=30 Oct 12 07:51:44 crc kubenswrapper[4599]: I1012 07:51:44.931632 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57db76b469-s5l9q" podStartSLOduration=2.931612836 podStartE2EDuration="2.931612836s" podCreationTimestamp="2025-10-12 07:51:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:51:44.926954086 +0000 UTC m=+1001.716149587" watchObservedRunningTime="2025-10-12 07:51:44.931612836 +0000 UTC m=+1001.720808338" Oct 12 07:51:45 crc kubenswrapper[4599]: I1012 07:51:45.485419 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:45 crc kubenswrapper[4599]: I1012 07:51:45.890804 4599 generic.go:334] "Generic (PLEG): container finished" 
podID="2f64dd0c-359b-4342-a814-923eb2a16de8" containerID="e165eb4006b0aef1fa3a4b1b73a4c8cd2fbea6091ee302ca98c3f51058dfdea5" exitCode=0 Oct 12 07:51:45 crc kubenswrapper[4599]: I1012 07:51:45.890854 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f64dd0c-359b-4342-a814-923eb2a16de8","Type":"ContainerDied","Data":"e165eb4006b0aef1fa3a4b1b73a4c8cd2fbea6091ee302ca98c3f51058dfdea5"} Oct 12 07:51:45 crc kubenswrapper[4599]: I1012 07:51:45.893860 4599 generic.go:334] "Generic (PLEG): container finished" podID="341bb186-49ec-48b2-85a0-05c66a619132" containerID="0c14022bfa9db06149f05d04d312fa7b9ab14179e6e0d10545f40e2809cd3368" exitCode=143 Oct 12 07:51:45 crc kubenswrapper[4599]: I1012 07:51:45.893950 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"341bb186-49ec-48b2-85a0-05c66a619132","Type":"ContainerDied","Data":"0c14022bfa9db06149f05d04d312fa7b9ab14179e6e0d10545f40e2809cd3368"} Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.308166 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.385529 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-scripts\") pod \"2f64dd0c-359b-4342-a814-923eb2a16de8\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.385585 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-ceilometer-tls-certs\") pod \"2f64dd0c-359b-4342-a814-923eb2a16de8\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.385619 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f64dd0c-359b-4342-a814-923eb2a16de8-run-httpd\") pod \"2f64dd0c-359b-4342-a814-923eb2a16de8\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.385710 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg2kk\" (UniqueName: \"kubernetes.io/projected/2f64dd0c-359b-4342-a814-923eb2a16de8-kube-api-access-gg2kk\") pod \"2f64dd0c-359b-4342-a814-923eb2a16de8\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.385731 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f64dd0c-359b-4342-a814-923eb2a16de8-log-httpd\") pod \"2f64dd0c-359b-4342-a814-923eb2a16de8\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.385863 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-config-data\") pod \"2f64dd0c-359b-4342-a814-923eb2a16de8\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.386440 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-combined-ca-bundle\") pod \"2f64dd0c-359b-4342-a814-923eb2a16de8\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.386467 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-sg-core-conf-yaml\") pod \"2f64dd0c-359b-4342-a814-923eb2a16de8\" (UID: \"2f64dd0c-359b-4342-a814-923eb2a16de8\") " Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.386055 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f64dd0c-359b-4342-a814-923eb2a16de8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2f64dd0c-359b-4342-a814-923eb2a16de8" (UID: "2f64dd0c-359b-4342-a814-923eb2a16de8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.386207 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f64dd0c-359b-4342-a814-923eb2a16de8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2f64dd0c-359b-4342-a814-923eb2a16de8" (UID: "2f64dd0c-359b-4342-a814-923eb2a16de8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.386852 4599 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f64dd0c-359b-4342-a814-923eb2a16de8-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.386871 4599 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f64dd0c-359b-4342-a814-923eb2a16de8-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.391718 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-scripts" (OuterVolumeSpecName: "scripts") pod "2f64dd0c-359b-4342-a814-923eb2a16de8" (UID: "2f64dd0c-359b-4342-a814-923eb2a16de8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.396478 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f64dd0c-359b-4342-a814-923eb2a16de8-kube-api-access-gg2kk" (OuterVolumeSpecName: "kube-api-access-gg2kk") pod "2f64dd0c-359b-4342-a814-923eb2a16de8" (UID: "2f64dd0c-359b-4342-a814-923eb2a16de8"). InnerVolumeSpecName "kube-api-access-gg2kk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.413281 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2f64dd0c-359b-4342-a814-923eb2a16de8" (UID: "2f64dd0c-359b-4342-a814-923eb2a16de8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.428214 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2f64dd0c-359b-4342-a814-923eb2a16de8" (UID: "2f64dd0c-359b-4342-a814-923eb2a16de8"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.446903 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f64dd0c-359b-4342-a814-923eb2a16de8" (UID: "2f64dd0c-359b-4342-a814-923eb2a16de8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.461725 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-config-data" (OuterVolumeSpecName: "config-data") pod "2f64dd0c-359b-4342-a814-923eb2a16de8" (UID: "2f64dd0c-359b-4342-a814-923eb2a16de8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.488894 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.488926 4599 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.488940 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg2kk\" (UniqueName: \"kubernetes.io/projected/2f64dd0c-359b-4342-a814-923eb2a16de8-kube-api-access-gg2kk\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.488949 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.488959 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.488968 4599 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f64dd0c-359b-4342-a814-923eb2a16de8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.919557 4599 generic.go:334] "Generic (PLEG): container finished" podID="2f64dd0c-359b-4342-a814-923eb2a16de8" containerID="d2dedd30e3874783f4673920d521039ef9fdf4a60edf85cfeec76a0dba065727" exitCode=0 Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.919625 4599 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f64dd0c-359b-4342-a814-923eb2a16de8","Type":"ContainerDied","Data":"d2dedd30e3874783f4673920d521039ef9fdf4a60edf85cfeec76a0dba065727"} Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.919696 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.920179 4599 scope.go:117] "RemoveContainer" containerID="604af05a6a54da5d9dec5916ada02a8812757e25950e35a59d9ad86da4c46901" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.920132 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f64dd0c-359b-4342-a814-923eb2a16de8","Type":"ContainerDied","Data":"d85eddc6c4050e4b6a07b4774ae8eae193972c6f81e8ef799e267af2d40f1f06"} Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.942165 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.943936 4599 scope.go:117] "RemoveContainer" containerID="b0ff498e8a7d91880aab3b49293ea7be8e807a97fa9b5cf175d3fd4c0293ae08" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.947526 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.962593 4599 scope.go:117] "RemoveContainer" containerID="d2dedd30e3874783f4673920d521039ef9fdf4a60edf85cfeec76a0dba065727" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.966569 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:51:47 crc kubenswrapper[4599]: E1012 07:51:47.966986 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f64dd0c-359b-4342-a814-923eb2a16de8" containerName="ceilometer-notification-agent" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.967007 4599 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2f64dd0c-359b-4342-a814-923eb2a16de8" containerName="ceilometer-notification-agent" Oct 12 07:51:47 crc kubenswrapper[4599]: E1012 07:51:47.967021 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f64dd0c-359b-4342-a814-923eb2a16de8" containerName="proxy-httpd" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.967028 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f64dd0c-359b-4342-a814-923eb2a16de8" containerName="proxy-httpd" Oct 12 07:51:47 crc kubenswrapper[4599]: E1012 07:51:47.967041 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f64dd0c-359b-4342-a814-923eb2a16de8" containerName="sg-core" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.967047 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f64dd0c-359b-4342-a814-923eb2a16de8" containerName="sg-core" Oct 12 07:51:47 crc kubenswrapper[4599]: E1012 07:51:47.967063 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f64dd0c-359b-4342-a814-923eb2a16de8" containerName="ceilometer-central-agent" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.967069 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f64dd0c-359b-4342-a814-923eb2a16de8" containerName="ceilometer-central-agent" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.967241 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f64dd0c-359b-4342-a814-923eb2a16de8" containerName="sg-core" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.967265 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f64dd0c-359b-4342-a814-923eb2a16de8" containerName="ceilometer-central-agent" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.967279 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f64dd0c-359b-4342-a814-923eb2a16de8" containerName="ceilometer-notification-agent" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.967301 4599 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2f64dd0c-359b-4342-a814-923eb2a16de8" containerName="proxy-httpd" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.969050 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.973011 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.973413 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.974999 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.976552 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.980238 4599 scope.go:117] "RemoveContainer" containerID="e165eb4006b0aef1fa3a4b1b73a4c8cd2fbea6091ee302ca98c3f51058dfdea5" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.998788 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d7fa90-5a03-4991-810a-59cf46e55ebf-config-data\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.998850 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95d7fa90-5a03-4991-810a-59cf46e55ebf-log-httpd\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.998947 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95d7fa90-5a03-4991-810a-59cf46e55ebf-run-httpd\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.998966 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d7fa90-5a03-4991-810a-59cf46e55ebf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.998992 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95d7fa90-5a03-4991-810a-59cf46e55ebf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.999018 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdhfp\" (UniqueName: \"kubernetes.io/projected/95d7fa90-5a03-4991-810a-59cf46e55ebf-kube-api-access-vdhfp\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.999290 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/95d7fa90-5a03-4991-810a-59cf46e55ebf-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:47 crc kubenswrapper[4599]: I1012 07:51:47.999434 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/95d7fa90-5a03-4991-810a-59cf46e55ebf-scripts\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.018332 4599 scope.go:117] "RemoveContainer" containerID="604af05a6a54da5d9dec5916ada02a8812757e25950e35a59d9ad86da4c46901" Oct 12 07:51:48 crc kubenswrapper[4599]: E1012 07:51:48.019191 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"604af05a6a54da5d9dec5916ada02a8812757e25950e35a59d9ad86da4c46901\": container with ID starting with 604af05a6a54da5d9dec5916ada02a8812757e25950e35a59d9ad86da4c46901 not found: ID does not exist" containerID="604af05a6a54da5d9dec5916ada02a8812757e25950e35a59d9ad86da4c46901" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.019271 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"604af05a6a54da5d9dec5916ada02a8812757e25950e35a59d9ad86da4c46901"} err="failed to get container status \"604af05a6a54da5d9dec5916ada02a8812757e25950e35a59d9ad86da4c46901\": rpc error: code = NotFound desc = could not find container \"604af05a6a54da5d9dec5916ada02a8812757e25950e35a59d9ad86da4c46901\": container with ID starting with 604af05a6a54da5d9dec5916ada02a8812757e25950e35a59d9ad86da4c46901 not found: ID does not exist" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.019301 4599 scope.go:117] "RemoveContainer" containerID="b0ff498e8a7d91880aab3b49293ea7be8e807a97fa9b5cf175d3fd4c0293ae08" Oct 12 07:51:48 crc kubenswrapper[4599]: E1012 07:51:48.019868 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0ff498e8a7d91880aab3b49293ea7be8e807a97fa9b5cf175d3fd4c0293ae08\": container with ID starting with b0ff498e8a7d91880aab3b49293ea7be8e807a97fa9b5cf175d3fd4c0293ae08 not found: ID does not exist" 
containerID="b0ff498e8a7d91880aab3b49293ea7be8e807a97fa9b5cf175d3fd4c0293ae08" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.019910 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0ff498e8a7d91880aab3b49293ea7be8e807a97fa9b5cf175d3fd4c0293ae08"} err="failed to get container status \"b0ff498e8a7d91880aab3b49293ea7be8e807a97fa9b5cf175d3fd4c0293ae08\": rpc error: code = NotFound desc = could not find container \"b0ff498e8a7d91880aab3b49293ea7be8e807a97fa9b5cf175d3fd4c0293ae08\": container with ID starting with b0ff498e8a7d91880aab3b49293ea7be8e807a97fa9b5cf175d3fd4c0293ae08 not found: ID does not exist" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.019925 4599 scope.go:117] "RemoveContainer" containerID="d2dedd30e3874783f4673920d521039ef9fdf4a60edf85cfeec76a0dba065727" Oct 12 07:51:48 crc kubenswrapper[4599]: E1012 07:51:48.020404 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2dedd30e3874783f4673920d521039ef9fdf4a60edf85cfeec76a0dba065727\": container with ID starting with d2dedd30e3874783f4673920d521039ef9fdf4a60edf85cfeec76a0dba065727 not found: ID does not exist" containerID="d2dedd30e3874783f4673920d521039ef9fdf4a60edf85cfeec76a0dba065727" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.020443 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2dedd30e3874783f4673920d521039ef9fdf4a60edf85cfeec76a0dba065727"} err="failed to get container status \"d2dedd30e3874783f4673920d521039ef9fdf4a60edf85cfeec76a0dba065727\": rpc error: code = NotFound desc = could not find container \"d2dedd30e3874783f4673920d521039ef9fdf4a60edf85cfeec76a0dba065727\": container with ID starting with d2dedd30e3874783f4673920d521039ef9fdf4a60edf85cfeec76a0dba065727 not found: ID does not exist" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.020466 4599 scope.go:117] 
"RemoveContainer" containerID="e165eb4006b0aef1fa3a4b1b73a4c8cd2fbea6091ee302ca98c3f51058dfdea5" Oct 12 07:51:48 crc kubenswrapper[4599]: E1012 07:51:48.020768 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e165eb4006b0aef1fa3a4b1b73a4c8cd2fbea6091ee302ca98c3f51058dfdea5\": container with ID starting with e165eb4006b0aef1fa3a4b1b73a4c8cd2fbea6091ee302ca98c3f51058dfdea5 not found: ID does not exist" containerID="e165eb4006b0aef1fa3a4b1b73a4c8cd2fbea6091ee302ca98c3f51058dfdea5" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.020854 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e165eb4006b0aef1fa3a4b1b73a4c8cd2fbea6091ee302ca98c3f51058dfdea5"} err="failed to get container status \"e165eb4006b0aef1fa3a4b1b73a4c8cd2fbea6091ee302ca98c3f51058dfdea5\": rpc error: code = NotFound desc = could not find container \"e165eb4006b0aef1fa3a4b1b73a4c8cd2fbea6091ee302ca98c3f51058dfdea5\": container with ID starting with e165eb4006b0aef1fa3a4b1b73a4c8cd2fbea6091ee302ca98c3f51058dfdea5 not found: ID does not exist" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.101897 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdhfp\" (UniqueName: \"kubernetes.io/projected/95d7fa90-5a03-4991-810a-59cf46e55ebf-kube-api-access-vdhfp\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.102299 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/95d7fa90-5a03-4991-810a-59cf46e55ebf-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.102433 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95d7fa90-5a03-4991-810a-59cf46e55ebf-scripts\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.102545 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d7fa90-5a03-4991-810a-59cf46e55ebf-config-data\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.102628 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95d7fa90-5a03-4991-810a-59cf46e55ebf-log-httpd\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.102735 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95d7fa90-5a03-4991-810a-59cf46e55ebf-run-httpd\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.102808 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d7fa90-5a03-4991-810a-59cf46e55ebf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.102917 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95d7fa90-5a03-4991-810a-59cf46e55ebf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " 
pod="openstack/ceilometer-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.103320 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95d7fa90-5a03-4991-810a-59cf46e55ebf-run-httpd\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.103266 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95d7fa90-5a03-4991-810a-59cf46e55ebf-log-httpd\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.107099 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95d7fa90-5a03-4991-810a-59cf46e55ebf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.107378 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d7fa90-5a03-4991-810a-59cf46e55ebf-config-data\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.108320 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/95d7fa90-5a03-4991-810a-59cf46e55ebf-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.112892 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/95d7fa90-5a03-4991-810a-59cf46e55ebf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.113434 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95d7fa90-5a03-4991-810a-59cf46e55ebf-scripts\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.118732 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdhfp\" (UniqueName: \"kubernetes.io/projected/95d7fa90-5a03-4991-810a-59cf46e55ebf-kube-api-access-vdhfp\") pod \"ceilometer-0\" (UID: \"95d7fa90-5a03-4991-810a-59cf46e55ebf\") " pod="openstack/ceilometer-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.283065 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.692873 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.772499 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.822149 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341bb186-49ec-48b2-85a0-05c66a619132-combined-ca-bundle\") pod \"341bb186-49ec-48b2-85a0-05c66a619132\" (UID: \"341bb186-49ec-48b2-85a0-05c66a619132\") " Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.822493 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkmlf\" (UniqueName: \"kubernetes.io/projected/341bb186-49ec-48b2-85a0-05c66a619132-kube-api-access-xkmlf\") pod \"341bb186-49ec-48b2-85a0-05c66a619132\" (UID: \"341bb186-49ec-48b2-85a0-05c66a619132\") " Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.822516 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/341bb186-49ec-48b2-85a0-05c66a619132-config-data\") pod \"341bb186-49ec-48b2-85a0-05c66a619132\" (UID: \"341bb186-49ec-48b2-85a0-05c66a619132\") " Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.822801 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/341bb186-49ec-48b2-85a0-05c66a619132-logs\") pod \"341bb186-49ec-48b2-85a0-05c66a619132\" (UID: \"341bb186-49ec-48b2-85a0-05c66a619132\") " Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.824178 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/341bb186-49ec-48b2-85a0-05c66a619132-logs" (OuterVolumeSpecName: "logs") pod "341bb186-49ec-48b2-85a0-05c66a619132" (UID: "341bb186-49ec-48b2-85a0-05c66a619132"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.829031 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/341bb186-49ec-48b2-85a0-05c66a619132-kube-api-access-xkmlf" (OuterVolumeSpecName: "kube-api-access-xkmlf") pod "341bb186-49ec-48b2-85a0-05c66a619132" (UID: "341bb186-49ec-48b2-85a0-05c66a619132"). InnerVolumeSpecName "kube-api-access-xkmlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.847407 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/341bb186-49ec-48b2-85a0-05c66a619132-config-data" (OuterVolumeSpecName: "config-data") pod "341bb186-49ec-48b2-85a0-05c66a619132" (UID: "341bb186-49ec-48b2-85a0-05c66a619132"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.848885 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/341bb186-49ec-48b2-85a0-05c66a619132-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "341bb186-49ec-48b2-85a0-05c66a619132" (UID: "341bb186-49ec-48b2-85a0-05c66a619132"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.924741 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341bb186-49ec-48b2-85a0-05c66a619132-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.924773 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkmlf\" (UniqueName: \"kubernetes.io/projected/341bb186-49ec-48b2-85a0-05c66a619132-kube-api-access-xkmlf\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.924788 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/341bb186-49ec-48b2-85a0-05c66a619132-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.924796 4599 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/341bb186-49ec-48b2-85a0-05c66a619132-logs\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.930942 4599 generic.go:334] "Generic (PLEG): container finished" podID="341bb186-49ec-48b2-85a0-05c66a619132" containerID="ddd8c04d279621d909b9a924d1a1d79dfdc5f9389691bd56dc312732c9a7aab2" exitCode=0 Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.931090 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"341bb186-49ec-48b2-85a0-05c66a619132","Type":"ContainerDied","Data":"ddd8c04d279621d909b9a924d1a1d79dfdc5f9389691bd56dc312732c9a7aab2"} Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.931189 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"341bb186-49ec-48b2-85a0-05c66a619132","Type":"ContainerDied","Data":"75017b9472da4d7f7b42264459bf59f1ae1c80e9f31fa65ee8e9d7aec1326fae"} Oct 12 07:51:48 crc kubenswrapper[4599]: 
I1012 07:51:48.931263 4599 scope.go:117] "RemoveContainer" containerID="ddd8c04d279621d909b9a924d1a1d79dfdc5f9389691bd56dc312732c9a7aab2" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.931444 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.933246 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95d7fa90-5a03-4991-810a-59cf46e55ebf","Type":"ContainerStarted","Data":"3d3e683dd9bc9222a1389f55e1bc63af4b88ec21e4c5d3ab41df5fec1004c96a"} Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.949392 4599 scope.go:117] "RemoveContainer" containerID="0c14022bfa9db06149f05d04d312fa7b9ab14179e6e0d10545f40e2809cd3368" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.972280 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.976517 4599 scope.go:117] "RemoveContainer" containerID="ddd8c04d279621d909b9a924d1a1d79dfdc5f9389691bd56dc312732c9a7aab2" Oct 12 07:51:48 crc kubenswrapper[4599]: E1012 07:51:48.977348 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddd8c04d279621d909b9a924d1a1d79dfdc5f9389691bd56dc312732c9a7aab2\": container with ID starting with ddd8c04d279621d909b9a924d1a1d79dfdc5f9389691bd56dc312732c9a7aab2 not found: ID does not exist" containerID="ddd8c04d279621d909b9a924d1a1d79dfdc5f9389691bd56dc312732c9a7aab2" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.977400 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddd8c04d279621d909b9a924d1a1d79dfdc5f9389691bd56dc312732c9a7aab2"} err="failed to get container status \"ddd8c04d279621d909b9a924d1a1d79dfdc5f9389691bd56dc312732c9a7aab2\": rpc error: code = NotFound desc = could not find container 
\"ddd8c04d279621d909b9a924d1a1d79dfdc5f9389691bd56dc312732c9a7aab2\": container with ID starting with ddd8c04d279621d909b9a924d1a1d79dfdc5f9389691bd56dc312732c9a7aab2 not found: ID does not exist" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.977428 4599 scope.go:117] "RemoveContainer" containerID="0c14022bfa9db06149f05d04d312fa7b9ab14179e6e0d10545f40e2809cd3368" Oct 12 07:51:48 crc kubenswrapper[4599]: E1012 07:51:48.977751 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c14022bfa9db06149f05d04d312fa7b9ab14179e6e0d10545f40e2809cd3368\": container with ID starting with 0c14022bfa9db06149f05d04d312fa7b9ab14179e6e0d10545f40e2809cd3368 not found: ID does not exist" containerID="0c14022bfa9db06149f05d04d312fa7b9ab14179e6e0d10545f40e2809cd3368" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.977790 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c14022bfa9db06149f05d04d312fa7b9ab14179e6e0d10545f40e2809cd3368"} err="failed to get container status \"0c14022bfa9db06149f05d04d312fa7b9ab14179e6e0d10545f40e2809cd3368\": rpc error: code = NotFound desc = could not find container \"0c14022bfa9db06149f05d04d312fa7b9ab14179e6e0d10545f40e2809cd3368\": container with ID starting with 0c14022bfa9db06149f05d04d312fa7b9ab14179e6e0d10545f40e2809cd3368 not found: ID does not exist" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.984413 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.992588 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 12 07:51:48 crc kubenswrapper[4599]: E1012 07:51:48.995104 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="341bb186-49ec-48b2-85a0-05c66a619132" containerName="nova-api-api" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.995138 4599 
state_mem.go:107] "Deleted CPUSet assignment" podUID="341bb186-49ec-48b2-85a0-05c66a619132" containerName="nova-api-api" Oct 12 07:51:48 crc kubenswrapper[4599]: E1012 07:51:48.995159 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="341bb186-49ec-48b2-85a0-05c66a619132" containerName="nova-api-log" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.995167 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="341bb186-49ec-48b2-85a0-05c66a619132" containerName="nova-api-log" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.995490 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="341bb186-49ec-48b2-85a0-05c66a619132" containerName="nova-api-api" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.995518 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="341bb186-49ec-48b2-85a0-05c66a619132" containerName="nova-api-log" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.997054 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.999092 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.999094 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.999299 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 12 07:51:48 crc kubenswrapper[4599]: I1012 07:51:48.999793 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.128892 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " pod="openstack/nova-api-0" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.128964 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtn89\" (UniqueName: \"kubernetes.io/projected/4d63b470-1f3c-4829-9f2b-f50560de4761-kube-api-access-gtn89\") pod \"nova-api-0\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " pod="openstack/nova-api-0" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.129102 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " pod="openstack/nova-api-0" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.129360 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d63b470-1f3c-4829-9f2b-f50560de4761-logs\") pod \"nova-api-0\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " pod="openstack/nova-api-0" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.129577 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-public-tls-certs\") pod \"nova-api-0\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " pod="openstack/nova-api-0" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.129677 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-config-data\") pod \"nova-api-0\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " pod="openstack/nova-api-0" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.231876 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d63b470-1f3c-4829-9f2b-f50560de4761-logs\") pod \"nova-api-0\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " pod="openstack/nova-api-0" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.231969 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-public-tls-certs\") pod \"nova-api-0\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " pod="openstack/nova-api-0" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.232004 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-config-data\") pod \"nova-api-0\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " pod="openstack/nova-api-0" Oct 12 07:51:49 crc kubenswrapper[4599]: 
I1012 07:51:49.232036 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " pod="openstack/nova-api-0" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.232068 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtn89\" (UniqueName: \"kubernetes.io/projected/4d63b470-1f3c-4829-9f2b-f50560de4761-kube-api-access-gtn89\") pod \"nova-api-0\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " pod="openstack/nova-api-0" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.232092 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " pod="openstack/nova-api-0" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.232720 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d63b470-1f3c-4829-9f2b-f50560de4761-logs\") pod \"nova-api-0\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " pod="openstack/nova-api-0" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.236985 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " pod="openstack/nova-api-0" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.237132 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"4d63b470-1f3c-4829-9f2b-f50560de4761\") " pod="openstack/nova-api-0" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.237497 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " pod="openstack/nova-api-0" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.238085 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-config-data\") pod \"nova-api-0\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " pod="openstack/nova-api-0" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.248471 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtn89\" (UniqueName: \"kubernetes.io/projected/4d63b470-1f3c-4829-9f2b-f50560de4761-kube-api-access-gtn89\") pod \"nova-api-0\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " pod="openstack/nova-api-0" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.318955 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.557001 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f64dd0c-359b-4342-a814-923eb2a16de8" path="/var/lib/kubelet/pods/2f64dd0c-359b-4342-a814-923eb2a16de8/volumes" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.558085 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="341bb186-49ec-48b2-85a0-05c66a619132" path="/var/lib/kubelet/pods/341bb186-49ec-48b2-85a0-05c66a619132/volumes" Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.736222 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.951324 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95d7fa90-5a03-4991-810a-59cf46e55ebf","Type":"ContainerStarted","Data":"07f8acfbcdfaf2596525edc33f5bfa5c5659dbca183adf89b264d75421a70e85"} Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.953148 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4d63b470-1f3c-4829-9f2b-f50560de4761","Type":"ContainerStarted","Data":"64fa9a1c037021538c64e9a3010f89bc89d8f02666ba6a035b0f095101c1862b"} Oct 12 07:51:49 crc kubenswrapper[4599]: I1012 07:51:49.953190 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4d63b470-1f3c-4829-9f2b-f50560de4761","Type":"ContainerStarted","Data":"bb27c18448eb85b1eaa452044d837f2315ec51d669f63fded8d430a920cdb97e"} Oct 12 07:51:50 crc kubenswrapper[4599]: I1012 07:51:50.485299 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:50 crc kubenswrapper[4599]: I1012 07:51:50.504566 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:50 crc kubenswrapper[4599]: I1012 
07:51:50.964425 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95d7fa90-5a03-4991-810a-59cf46e55ebf","Type":"ContainerStarted","Data":"8f69d15745c37c2cf9d30a5e6fc06d04ef9ccfcc81eb3dfab18ea15afc6871a1"} Oct 12 07:51:50 crc kubenswrapper[4599]: I1012 07:51:50.967276 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4d63b470-1f3c-4829-9f2b-f50560de4761","Type":"ContainerStarted","Data":"62f0ea2a44e2fe1ffbfbc08042079fe9de672411baf277fdd2a76fcc4cd128dd"} Oct 12 07:51:50 crc kubenswrapper[4599]: I1012 07:51:50.985327 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 12 07:51:50 crc kubenswrapper[4599]: I1012 07:51:50.988301 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.9882734109999998 podStartE2EDuration="2.988273411s" podCreationTimestamp="2025-10-12 07:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:51:50.986853111 +0000 UTC m=+1007.776048613" watchObservedRunningTime="2025-10-12 07:51:50.988273411 +0000 UTC m=+1007.777468913" Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.112755 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-mz8gv"] Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.114225 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mz8gv" Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.116416 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.117360 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.124552 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mz8gv"] Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.275168 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7331ab7-59b5-4d49-9314-e59cbb00f156-scripts\") pod \"nova-cell1-cell-mapping-mz8gv\" (UID: \"e7331ab7-59b5-4d49-9314-e59cbb00f156\") " pod="openstack/nova-cell1-cell-mapping-mz8gv" Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.275282 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7331ab7-59b5-4d49-9314-e59cbb00f156-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mz8gv\" (UID: \"e7331ab7-59b5-4d49-9314-e59cbb00f156\") " pod="openstack/nova-cell1-cell-mapping-mz8gv" Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.275403 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7331ab7-59b5-4d49-9314-e59cbb00f156-config-data\") pod \"nova-cell1-cell-mapping-mz8gv\" (UID: \"e7331ab7-59b5-4d49-9314-e59cbb00f156\") " pod="openstack/nova-cell1-cell-mapping-mz8gv" Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.275442 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjbbf\" (UniqueName: 
\"kubernetes.io/projected/e7331ab7-59b5-4d49-9314-e59cbb00f156-kube-api-access-qjbbf\") pod \"nova-cell1-cell-mapping-mz8gv\" (UID: \"e7331ab7-59b5-4d49-9314-e59cbb00f156\") " pod="openstack/nova-cell1-cell-mapping-mz8gv" Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.376738 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7331ab7-59b5-4d49-9314-e59cbb00f156-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mz8gv\" (UID: \"e7331ab7-59b5-4d49-9314-e59cbb00f156\") " pod="openstack/nova-cell1-cell-mapping-mz8gv" Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.376859 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7331ab7-59b5-4d49-9314-e59cbb00f156-config-data\") pod \"nova-cell1-cell-mapping-mz8gv\" (UID: \"e7331ab7-59b5-4d49-9314-e59cbb00f156\") " pod="openstack/nova-cell1-cell-mapping-mz8gv" Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.376908 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjbbf\" (UniqueName: \"kubernetes.io/projected/e7331ab7-59b5-4d49-9314-e59cbb00f156-kube-api-access-qjbbf\") pod \"nova-cell1-cell-mapping-mz8gv\" (UID: \"e7331ab7-59b5-4d49-9314-e59cbb00f156\") " pod="openstack/nova-cell1-cell-mapping-mz8gv" Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.377000 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7331ab7-59b5-4d49-9314-e59cbb00f156-scripts\") pod \"nova-cell1-cell-mapping-mz8gv\" (UID: \"e7331ab7-59b5-4d49-9314-e59cbb00f156\") " pod="openstack/nova-cell1-cell-mapping-mz8gv" Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.380868 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e7331ab7-59b5-4d49-9314-e59cbb00f156-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mz8gv\" (UID: \"e7331ab7-59b5-4d49-9314-e59cbb00f156\") " pod="openstack/nova-cell1-cell-mapping-mz8gv" Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.381742 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7331ab7-59b5-4d49-9314-e59cbb00f156-config-data\") pod \"nova-cell1-cell-mapping-mz8gv\" (UID: \"e7331ab7-59b5-4d49-9314-e59cbb00f156\") " pod="openstack/nova-cell1-cell-mapping-mz8gv" Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.384750 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7331ab7-59b5-4d49-9314-e59cbb00f156-scripts\") pod \"nova-cell1-cell-mapping-mz8gv\" (UID: \"e7331ab7-59b5-4d49-9314-e59cbb00f156\") " pod="openstack/nova-cell1-cell-mapping-mz8gv" Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.405665 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjbbf\" (UniqueName: \"kubernetes.io/projected/e7331ab7-59b5-4d49-9314-e59cbb00f156-kube-api-access-qjbbf\") pod \"nova-cell1-cell-mapping-mz8gv\" (UID: \"e7331ab7-59b5-4d49-9314-e59cbb00f156\") " pod="openstack/nova-cell1-cell-mapping-mz8gv" Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.433412 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mz8gv" Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.832617 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mz8gv"] Oct 12 07:51:51 crc kubenswrapper[4599]: W1012 07:51:51.836734 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7331ab7_59b5_4d49_9314_e59cbb00f156.slice/crio-48db29a446f256d573b754287281cc83745cf1aa1ca801fb7d853783c0ba123e WatchSource:0}: Error finding container 48db29a446f256d573b754287281cc83745cf1aa1ca801fb7d853783c0ba123e: Status 404 returned error can't find the container with id 48db29a446f256d573b754287281cc83745cf1aa1ca801fb7d853783c0ba123e Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.981453 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mz8gv" event={"ID":"e7331ab7-59b5-4d49-9314-e59cbb00f156","Type":"ContainerStarted","Data":"48db29a446f256d573b754287281cc83745cf1aa1ca801fb7d853783c0ba123e"} Oct 12 07:51:51 crc kubenswrapper[4599]: I1012 07:51:51.988717 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95d7fa90-5a03-4991-810a-59cf46e55ebf","Type":"ContainerStarted","Data":"68a1843e3998ec2cec4c0a314da555bc7caaa6aec95a06336db4d2eae6eb88ce"} Oct 12 07:51:52 crc kubenswrapper[4599]: I1012 07:51:52.625210 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:51:52 crc kubenswrapper[4599]: I1012 07:51:52.693082 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-94c48d4d7-z6cgn"] Oct 12 07:51:52 crc kubenswrapper[4599]: I1012 07:51:52.693367 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn" podUID="8361da66-d18b-4f7c-8c5e-6e1fc04b8f11" containerName="dnsmasq-dns" 
containerID="cri-o://d90966a1acb1c1aa7d8e5c9d09d5294767275b776e8b1fb1a864173cd5629e23" gracePeriod=10 Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.006398 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mz8gv" event={"ID":"e7331ab7-59b5-4d49-9314-e59cbb00f156","Type":"ContainerStarted","Data":"e36713052874f7760fa2c79f35758ba1305bdd4f4f880c287b39882eb5b5c250"} Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.013714 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95d7fa90-5a03-4991-810a-59cf46e55ebf","Type":"ContainerStarted","Data":"9a777b1e2ba7226cbbf2ecfa89b7a9be891ae16cbc847bdc8c43d6cfae9955a3"} Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.013874 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.016021 4599 generic.go:334] "Generic (PLEG): container finished" podID="8361da66-d18b-4f7c-8c5e-6e1fc04b8f11" containerID="d90966a1acb1c1aa7d8e5c9d09d5294767275b776e8b1fb1a864173cd5629e23" exitCode=0 Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.016065 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn" event={"ID":"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11","Type":"ContainerDied","Data":"d90966a1acb1c1aa7d8e5c9d09d5294767275b776e8b1fb1a864173cd5629e23"} Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.022701 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-mz8gv" podStartSLOduration=2.02268583 podStartE2EDuration="2.02268583s" podCreationTimestamp="2025-10-12 07:51:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:51:53.018382259 +0000 UTC m=+1009.807577761" watchObservedRunningTime="2025-10-12 07:51:53.02268583 +0000 UTC 
m=+1009.811881332" Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.077828 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.204940119 podStartE2EDuration="6.077802474s" podCreationTimestamp="2025-10-12 07:51:47 +0000 UTC" firstStartedPulling="2025-10-12 07:51:48.784068739 +0000 UTC m=+1005.573264241" lastFinishedPulling="2025-10-12 07:51:52.656931094 +0000 UTC m=+1009.446126596" observedRunningTime="2025-10-12 07:51:53.03889826 +0000 UTC m=+1009.828093761" watchObservedRunningTime="2025-10-12 07:51:53.077802474 +0000 UTC m=+1009.866997976" Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.088507 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn" Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.114470 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-ovsdbserver-sb\") pod \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.114568 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-dns-svc\") pod \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.114587 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-dns-swift-storage-0\") pod \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.114650 4599 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-j69hk\" (UniqueName: \"kubernetes.io/projected/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-kube-api-access-j69hk\") pod \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.114737 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-ovsdbserver-nb\") pod \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.114769 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-config\") pod \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\" (UID: \"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11\") " Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.121695 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-kube-api-access-j69hk" (OuterVolumeSpecName: "kube-api-access-j69hk") pod "8361da66-d18b-4f7c-8c5e-6e1fc04b8f11" (UID: "8361da66-d18b-4f7c-8c5e-6e1fc04b8f11"). InnerVolumeSpecName "kube-api-access-j69hk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.166511 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8361da66-d18b-4f7c-8c5e-6e1fc04b8f11" (UID: "8361da66-d18b-4f7c-8c5e-6e1fc04b8f11"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.173825 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-config" (OuterVolumeSpecName: "config") pod "8361da66-d18b-4f7c-8c5e-6e1fc04b8f11" (UID: "8361da66-d18b-4f7c-8c5e-6e1fc04b8f11"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.173867 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8361da66-d18b-4f7c-8c5e-6e1fc04b8f11" (UID: "8361da66-d18b-4f7c-8c5e-6e1fc04b8f11"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.176302 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8361da66-d18b-4f7c-8c5e-6e1fc04b8f11" (UID: "8361da66-d18b-4f7c-8c5e-6e1fc04b8f11"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.189400 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8361da66-d18b-4f7c-8c5e-6e1fc04b8f11" (UID: "8361da66-d18b-4f7c-8c5e-6e1fc04b8f11"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.216727 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.216754 4599 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.216764 4599 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.216776 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j69hk\" (UniqueName: \"kubernetes.io/projected/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-kube-api-access-j69hk\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.216785 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:53 crc kubenswrapper[4599]: I1012 07:51:53.216793 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:54 crc kubenswrapper[4599]: I1012 07:51:54.038803 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn" Oct 12 07:51:54 crc kubenswrapper[4599]: I1012 07:51:54.039252 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94c48d4d7-z6cgn" event={"ID":"8361da66-d18b-4f7c-8c5e-6e1fc04b8f11","Type":"ContainerDied","Data":"86ee33f29c0b4e525dc697531c0fd590867b86494f74dec87ac79c38f77490ab"} Oct 12 07:51:54 crc kubenswrapper[4599]: I1012 07:51:54.039297 4599 scope.go:117] "RemoveContainer" containerID="d90966a1acb1c1aa7d8e5c9d09d5294767275b776e8b1fb1a864173cd5629e23" Oct 12 07:51:54 crc kubenswrapper[4599]: I1012 07:51:54.061405 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-94c48d4d7-z6cgn"] Oct 12 07:51:54 crc kubenswrapper[4599]: I1012 07:51:54.066937 4599 scope.go:117] "RemoveContainer" containerID="434e31289d95dba5b878fa38d669f637c9667e9b30ef07bc25abe9022084dccb" Oct 12 07:51:54 crc kubenswrapper[4599]: I1012 07:51:54.070239 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-94c48d4d7-z6cgn"] Oct 12 07:51:55 crc kubenswrapper[4599]: I1012 07:51:55.554657 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8361da66-d18b-4f7c-8c5e-6e1fc04b8f11" path="/var/lib/kubelet/pods/8361da66-d18b-4f7c-8c5e-6e1fc04b8f11/volumes" Oct 12 07:51:57 crc kubenswrapper[4599]: I1012 07:51:57.072002 4599 generic.go:334] "Generic (PLEG): container finished" podID="e7331ab7-59b5-4d49-9314-e59cbb00f156" containerID="e36713052874f7760fa2c79f35758ba1305bdd4f4f880c287b39882eb5b5c250" exitCode=0 Oct 12 07:51:57 crc kubenswrapper[4599]: I1012 07:51:57.072065 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mz8gv" event={"ID":"e7331ab7-59b5-4d49-9314-e59cbb00f156","Type":"ContainerDied","Data":"e36713052874f7760fa2c79f35758ba1305bdd4f4f880c287b39882eb5b5c250"} Oct 12 07:51:58 crc kubenswrapper[4599]: I1012 07:51:58.393987 4599 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mz8gv" Oct 12 07:51:58 crc kubenswrapper[4599]: I1012 07:51:58.554607 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7331ab7-59b5-4d49-9314-e59cbb00f156-scripts\") pod \"e7331ab7-59b5-4d49-9314-e59cbb00f156\" (UID: \"e7331ab7-59b5-4d49-9314-e59cbb00f156\") " Oct 12 07:51:58 crc kubenswrapper[4599]: I1012 07:51:58.554762 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7331ab7-59b5-4d49-9314-e59cbb00f156-combined-ca-bundle\") pod \"e7331ab7-59b5-4d49-9314-e59cbb00f156\" (UID: \"e7331ab7-59b5-4d49-9314-e59cbb00f156\") " Oct 12 07:51:58 crc kubenswrapper[4599]: I1012 07:51:58.555255 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7331ab7-59b5-4d49-9314-e59cbb00f156-config-data\") pod \"e7331ab7-59b5-4d49-9314-e59cbb00f156\" (UID: \"e7331ab7-59b5-4d49-9314-e59cbb00f156\") " Oct 12 07:51:58 crc kubenswrapper[4599]: I1012 07:51:58.555992 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjbbf\" (UniqueName: \"kubernetes.io/projected/e7331ab7-59b5-4d49-9314-e59cbb00f156-kube-api-access-qjbbf\") pod \"e7331ab7-59b5-4d49-9314-e59cbb00f156\" (UID: \"e7331ab7-59b5-4d49-9314-e59cbb00f156\") " Oct 12 07:51:58 crc kubenswrapper[4599]: I1012 07:51:58.561975 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7331ab7-59b5-4d49-9314-e59cbb00f156-kube-api-access-qjbbf" (OuterVolumeSpecName: "kube-api-access-qjbbf") pod "e7331ab7-59b5-4d49-9314-e59cbb00f156" (UID: "e7331ab7-59b5-4d49-9314-e59cbb00f156"). InnerVolumeSpecName "kube-api-access-qjbbf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:51:58 crc kubenswrapper[4599]: I1012 07:51:58.568715 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7331ab7-59b5-4d49-9314-e59cbb00f156-scripts" (OuterVolumeSpecName: "scripts") pod "e7331ab7-59b5-4d49-9314-e59cbb00f156" (UID: "e7331ab7-59b5-4d49-9314-e59cbb00f156"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:58 crc kubenswrapper[4599]: I1012 07:51:58.584721 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7331ab7-59b5-4d49-9314-e59cbb00f156-config-data" (OuterVolumeSpecName: "config-data") pod "e7331ab7-59b5-4d49-9314-e59cbb00f156" (UID: "e7331ab7-59b5-4d49-9314-e59cbb00f156"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:58 crc kubenswrapper[4599]: I1012 07:51:58.590356 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7331ab7-59b5-4d49-9314-e59cbb00f156-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7331ab7-59b5-4d49-9314-e59cbb00f156" (UID: "e7331ab7-59b5-4d49-9314-e59cbb00f156"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:51:58 crc kubenswrapper[4599]: I1012 07:51:58.659949 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjbbf\" (UniqueName: \"kubernetes.io/projected/e7331ab7-59b5-4d49-9314-e59cbb00f156-kube-api-access-qjbbf\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:58 crc kubenswrapper[4599]: I1012 07:51:58.659977 4599 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7331ab7-59b5-4d49-9314-e59cbb00f156-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:58 crc kubenswrapper[4599]: I1012 07:51:58.659989 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7331ab7-59b5-4d49-9314-e59cbb00f156-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:58 crc kubenswrapper[4599]: I1012 07:51:58.660001 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7331ab7-59b5-4d49-9314-e59cbb00f156-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:51:59 crc kubenswrapper[4599]: I1012 07:51:59.095433 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mz8gv" event={"ID":"e7331ab7-59b5-4d49-9314-e59cbb00f156","Type":"ContainerDied","Data":"48db29a446f256d573b754287281cc83745cf1aa1ca801fb7d853783c0ba123e"} Oct 12 07:51:59 crc kubenswrapper[4599]: I1012 07:51:59.095834 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48db29a446f256d573b754287281cc83745cf1aa1ca801fb7d853783c0ba123e" Oct 12 07:51:59 crc kubenswrapper[4599]: I1012 07:51:59.095502 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mz8gv" Oct 12 07:51:59 crc kubenswrapper[4599]: I1012 07:51:59.314290 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 07:51:59 crc kubenswrapper[4599]: I1012 07:51:59.314549 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0" containerName="nova-scheduler-scheduler" containerID="cri-o://045f9179b498ac00a5359fba8cf9dd1337a8e9970cd9e345df9d4d0b0841076b" gracePeriod=30 Oct 12 07:51:59 crc kubenswrapper[4599]: I1012 07:51:59.319485 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 07:51:59 crc kubenswrapper[4599]: I1012 07:51:59.319545 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 07:51:59 crc kubenswrapper[4599]: I1012 07:51:59.329257 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 07:51:59 crc kubenswrapper[4599]: I1012 07:51:59.329827 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="44ffa825-906e-4f85-9da7-bf4f38d28e59" containerName="nova-metadata-log" containerID="cri-o://fd084a8cf01892565881cff8cde7533a774ae578612d822ce20bea0fe2efc930" gracePeriod=30 Oct 12 07:51:59 crc kubenswrapper[4599]: I1012 07:51:59.329954 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="44ffa825-906e-4f85-9da7-bf4f38d28e59" containerName="nova-metadata-metadata" containerID="cri-o://65fb22f48efa762e822f7330fbe6757c38f2eae38bc94dbe46fc002baf4228cd" gracePeriod=30 Oct 12 07:51:59 crc kubenswrapper[4599]: I1012 07:51:59.343508 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 12 07:52:00 crc kubenswrapper[4599]: I1012 07:52:00.105366 4599 
generic.go:334] "Generic (PLEG): container finished" podID="44ffa825-906e-4f85-9da7-bf4f38d28e59" containerID="fd084a8cf01892565881cff8cde7533a774ae578612d822ce20bea0fe2efc930" exitCode=143 Oct 12 07:52:00 crc kubenswrapper[4599]: I1012 07:52:00.105552 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"44ffa825-906e-4f85-9da7-bf4f38d28e59","Type":"ContainerDied","Data":"fd084a8cf01892565881cff8cde7533a774ae578612d822ce20bea0fe2efc930"} Oct 12 07:52:00 crc kubenswrapper[4599]: I1012 07:52:00.105796 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4d63b470-1f3c-4829-9f2b-f50560de4761" containerName="nova-api-log" containerID="cri-o://64fa9a1c037021538c64e9a3010f89bc89d8f02666ba6a035b0f095101c1862b" gracePeriod=30 Oct 12 07:52:00 crc kubenswrapper[4599]: I1012 07:52:00.106115 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4d63b470-1f3c-4829-9f2b-f50560de4761" containerName="nova-api-api" containerID="cri-o://62f0ea2a44e2fe1ffbfbc08042079fe9de672411baf277fdd2a76fcc4cd128dd" gracePeriod=30 Oct 12 07:52:00 crc kubenswrapper[4599]: I1012 07:52:00.118950 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4d63b470-1f3c-4829-9f2b-f50560de4761" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.198:8774/\": EOF" Oct 12 07:52:00 crc kubenswrapper[4599]: I1012 07:52:00.121856 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4d63b470-1f3c-4829-9f2b-f50560de4761" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.198:8774/\": EOF" Oct 12 07:52:00 crc kubenswrapper[4599]: I1012 07:52:00.640699 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 07:52:00 crc kubenswrapper[4599]: I1012 07:52:00.708396 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0-config-data\") pod \"9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0\" (UID: \"9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0\") " Oct 12 07:52:00 crc kubenswrapper[4599]: I1012 07:52:00.708473 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0-combined-ca-bundle\") pod \"9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0\" (UID: \"9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0\") " Oct 12 07:52:00 crc kubenswrapper[4599]: I1012 07:52:00.708515 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vqbz\" (UniqueName: \"kubernetes.io/projected/9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0-kube-api-access-8vqbz\") pod \"9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0\" (UID: \"9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0\") " Oct 12 07:52:00 crc kubenswrapper[4599]: I1012 07:52:00.716824 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0-kube-api-access-8vqbz" (OuterVolumeSpecName: "kube-api-access-8vqbz") pod "9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0" (UID: "9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0"). InnerVolumeSpecName "kube-api-access-8vqbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:52:00 crc kubenswrapper[4599]: I1012 07:52:00.734183 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0" (UID: "9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:52:00 crc kubenswrapper[4599]: I1012 07:52:00.734586 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0-config-data" (OuterVolumeSpecName: "config-data") pod "9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0" (UID: "9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:52:00 crc kubenswrapper[4599]: I1012 07:52:00.812402 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:00 crc kubenswrapper[4599]: I1012 07:52:00.812446 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:00 crc kubenswrapper[4599]: I1012 07:52:00.812461 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vqbz\" (UniqueName: \"kubernetes.io/projected/9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0-kube-api-access-8vqbz\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.116910 4599 generic.go:334] "Generic (PLEG): container finished" podID="4d63b470-1f3c-4829-9f2b-f50560de4761" containerID="64fa9a1c037021538c64e9a3010f89bc89d8f02666ba6a035b0f095101c1862b" exitCode=143 Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.117735 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4d63b470-1f3c-4829-9f2b-f50560de4761","Type":"ContainerDied","Data":"64fa9a1c037021538c64e9a3010f89bc89d8f02666ba6a035b0f095101c1862b"} Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.119372 4599 generic.go:334] "Generic (PLEG): container finished" 
podID="9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0" containerID="045f9179b498ac00a5359fba8cf9dd1337a8e9970cd9e345df9d4d0b0841076b" exitCode=0 Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.119399 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0","Type":"ContainerDied","Data":"045f9179b498ac00a5359fba8cf9dd1337a8e9970cd9e345df9d4d0b0841076b"} Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.119416 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0","Type":"ContainerDied","Data":"18b15edd1792c5b1d06a3bf2ac5b445e627efcee2e31bda0225826675b0cb81f"} Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.119439 4599 scope.go:117] "RemoveContainer" containerID="045f9179b498ac00a5359fba8cf9dd1337a8e9970cd9e345df9d4d0b0841076b" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.119599 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.155742 4599 scope.go:117] "RemoveContainer" containerID="045f9179b498ac00a5359fba8cf9dd1337a8e9970cd9e345df9d4d0b0841076b" Oct 12 07:52:01 crc kubenswrapper[4599]: E1012 07:52:01.156208 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"045f9179b498ac00a5359fba8cf9dd1337a8e9970cd9e345df9d4d0b0841076b\": container with ID starting with 045f9179b498ac00a5359fba8cf9dd1337a8e9970cd9e345df9d4d0b0841076b not found: ID does not exist" containerID="045f9179b498ac00a5359fba8cf9dd1337a8e9970cd9e345df9d4d0b0841076b" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.156250 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"045f9179b498ac00a5359fba8cf9dd1337a8e9970cd9e345df9d4d0b0841076b"} err="failed to get container status \"045f9179b498ac00a5359fba8cf9dd1337a8e9970cd9e345df9d4d0b0841076b\": rpc error: code = NotFound desc = could not find container \"045f9179b498ac00a5359fba8cf9dd1337a8e9970cd9e345df9d4d0b0841076b\": container with ID starting with 045f9179b498ac00a5359fba8cf9dd1337a8e9970cd9e345df9d4d0b0841076b not found: ID does not exist" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.158736 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.165648 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.174027 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 07:52:01 crc kubenswrapper[4599]: E1012 07:52:01.174584 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8361da66-d18b-4f7c-8c5e-6e1fc04b8f11" containerName="init" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 
07:52:01.174654 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="8361da66-d18b-4f7c-8c5e-6e1fc04b8f11" containerName="init" Oct 12 07:52:01 crc kubenswrapper[4599]: E1012 07:52:01.174729 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0" containerName="nova-scheduler-scheduler" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.174779 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0" containerName="nova-scheduler-scheduler" Oct 12 07:52:01 crc kubenswrapper[4599]: E1012 07:52:01.174841 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7331ab7-59b5-4d49-9314-e59cbb00f156" containerName="nova-manage" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.174898 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7331ab7-59b5-4d49-9314-e59cbb00f156" containerName="nova-manage" Oct 12 07:52:01 crc kubenswrapper[4599]: E1012 07:52:01.174972 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8361da66-d18b-4f7c-8c5e-6e1fc04b8f11" containerName="dnsmasq-dns" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.175021 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="8361da66-d18b-4f7c-8c5e-6e1fc04b8f11" containerName="dnsmasq-dns" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.175237 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7331ab7-59b5-4d49-9314-e59cbb00f156" containerName="nova-manage" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.176854 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="8361da66-d18b-4f7c-8c5e-6e1fc04b8f11" containerName="dnsmasq-dns" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.176877 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0" containerName="nova-scheduler-scheduler" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.177688 
4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.184862 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.190964 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.219643 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223d3e1a-86ac-49d9-a231-b77957770434-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"223d3e1a-86ac-49d9-a231-b77957770434\") " pod="openstack/nova-scheduler-0" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.219741 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/223d3e1a-86ac-49d9-a231-b77957770434-config-data\") pod \"nova-scheduler-0\" (UID: \"223d3e1a-86ac-49d9-a231-b77957770434\") " pod="openstack/nova-scheduler-0" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.219793 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgxnf\" (UniqueName: \"kubernetes.io/projected/223d3e1a-86ac-49d9-a231-b77957770434-kube-api-access-rgxnf\") pod \"nova-scheduler-0\" (UID: \"223d3e1a-86ac-49d9-a231-b77957770434\") " pod="openstack/nova-scheduler-0" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.320769 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/223d3e1a-86ac-49d9-a231-b77957770434-config-data\") pod \"nova-scheduler-0\" (UID: \"223d3e1a-86ac-49d9-a231-b77957770434\") " pod="openstack/nova-scheduler-0" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 
07:52:01.321413 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgxnf\" (UniqueName: \"kubernetes.io/projected/223d3e1a-86ac-49d9-a231-b77957770434-kube-api-access-rgxnf\") pod \"nova-scheduler-0\" (UID: \"223d3e1a-86ac-49d9-a231-b77957770434\") " pod="openstack/nova-scheduler-0" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.321534 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223d3e1a-86ac-49d9-a231-b77957770434-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"223d3e1a-86ac-49d9-a231-b77957770434\") " pod="openstack/nova-scheduler-0" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.325441 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223d3e1a-86ac-49d9-a231-b77957770434-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"223d3e1a-86ac-49d9-a231-b77957770434\") " pod="openstack/nova-scheduler-0" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.331775 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/223d3e1a-86ac-49d9-a231-b77957770434-config-data\") pod \"nova-scheduler-0\" (UID: \"223d3e1a-86ac-49d9-a231-b77957770434\") " pod="openstack/nova-scheduler-0" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.340007 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgxnf\" (UniqueName: \"kubernetes.io/projected/223d3e1a-86ac-49d9-a231-b77957770434-kube-api-access-rgxnf\") pod \"nova-scheduler-0\" (UID: \"223d3e1a-86ac-49d9-a231-b77957770434\") " pod="openstack/nova-scheduler-0" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.492677 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.554889 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0" path="/var/lib/kubelet/pods/9bbec8b9-8aae-4ee9-a258-1c7bb7a75fd0/volumes" Oct 12 07:52:01 crc kubenswrapper[4599]: I1012 07:52:01.907293 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 07:52:02 crc kubenswrapper[4599]: I1012 07:52:02.132913 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"223d3e1a-86ac-49d9-a231-b77957770434","Type":"ContainerStarted","Data":"a968e64d933a71342123e58af117b3b96543a0c25b3dd987921e21c74cd7d961"} Oct 12 07:52:02 crc kubenswrapper[4599]: I1012 07:52:02.133231 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"223d3e1a-86ac-49d9-a231-b77957770434","Type":"ContainerStarted","Data":"772889ec3747326be98339da915cbfda46bd66120b11160b107ffe20262a8f1a"} Oct 12 07:52:02 crc kubenswrapper[4599]: I1012 07:52:02.151928 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.151905427 podStartE2EDuration="1.151905427s" podCreationTimestamp="2025-10-12 07:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:52:02.147207183 +0000 UTC m=+1018.936402685" watchObservedRunningTime="2025-10-12 07:52:02.151905427 +0000 UTC m=+1018.941100920" Oct 12 07:52:02 crc kubenswrapper[4599]: I1012 07:52:02.813678 4599 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="44ffa825-906e-4f85-9da7-bf4f38d28e59" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:39812->10.217.0.192:8775: read: connection reset by 
peer" Oct 12 07:52:02 crc kubenswrapper[4599]: I1012 07:52:02.813686 4599 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="44ffa825-906e-4f85-9da7-bf4f38d28e59" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:39810->10.217.0.192:8775: read: connection reset by peer" Oct 12 07:52:02 crc kubenswrapper[4599]: E1012 07:52:02.870845 4599 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44ffa825_906e_4f85_9da7_bf4f38d28e59.slice/crio-65fb22f48efa762e822f7330fbe6757c38f2eae38bc94dbe46fc002baf4228cd.scope\": RecentStats: unable to find data in memory cache]" Oct 12 07:52:03 crc kubenswrapper[4599]: I1012 07:52:03.147698 4599 generic.go:334] "Generic (PLEG): container finished" podID="44ffa825-906e-4f85-9da7-bf4f38d28e59" containerID="65fb22f48efa762e822f7330fbe6757c38f2eae38bc94dbe46fc002baf4228cd" exitCode=0 Oct 12 07:52:03 crc kubenswrapper[4599]: I1012 07:52:03.147790 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"44ffa825-906e-4f85-9da7-bf4f38d28e59","Type":"ContainerDied","Data":"65fb22f48efa762e822f7330fbe6757c38f2eae38bc94dbe46fc002baf4228cd"} Oct 12 07:52:03 crc kubenswrapper[4599]: I1012 07:52:03.147842 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"44ffa825-906e-4f85-9da7-bf4f38d28e59","Type":"ContainerDied","Data":"8a7e9532a9c9b00010e5b48ce7160f5806c5df784b198f9cc7e1cc9812c6762c"} Oct 12 07:52:03 crc kubenswrapper[4599]: I1012 07:52:03.147854 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a7e9532a9c9b00010e5b48ce7160f5806c5df784b198f9cc7e1cc9812c6762c" Oct 12 07:52:03 crc kubenswrapper[4599]: I1012 07:52:03.163746 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 07:52:03 crc kubenswrapper[4599]: I1012 07:52:03.361455 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ffa825-906e-4f85-9da7-bf4f38d28e59-nova-metadata-tls-certs\") pod \"44ffa825-906e-4f85-9da7-bf4f38d28e59\" (UID: \"44ffa825-906e-4f85-9da7-bf4f38d28e59\") " Oct 12 07:52:03 crc kubenswrapper[4599]: I1012 07:52:03.362601 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtxbh\" (UniqueName: \"kubernetes.io/projected/44ffa825-906e-4f85-9da7-bf4f38d28e59-kube-api-access-vtxbh\") pod \"44ffa825-906e-4f85-9da7-bf4f38d28e59\" (UID: \"44ffa825-906e-4f85-9da7-bf4f38d28e59\") " Oct 12 07:52:03 crc kubenswrapper[4599]: I1012 07:52:03.362734 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44ffa825-906e-4f85-9da7-bf4f38d28e59-logs\") pod \"44ffa825-906e-4f85-9da7-bf4f38d28e59\" (UID: \"44ffa825-906e-4f85-9da7-bf4f38d28e59\") " Oct 12 07:52:03 crc kubenswrapper[4599]: I1012 07:52:03.362809 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ffa825-906e-4f85-9da7-bf4f38d28e59-combined-ca-bundle\") pod \"44ffa825-906e-4f85-9da7-bf4f38d28e59\" (UID: \"44ffa825-906e-4f85-9da7-bf4f38d28e59\") " Oct 12 07:52:03 crc kubenswrapper[4599]: I1012 07:52:03.363092 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44ffa825-906e-4f85-9da7-bf4f38d28e59-config-data\") pod \"44ffa825-906e-4f85-9da7-bf4f38d28e59\" (UID: \"44ffa825-906e-4f85-9da7-bf4f38d28e59\") " Oct 12 07:52:03 crc kubenswrapper[4599]: I1012 07:52:03.363461 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/44ffa825-906e-4f85-9da7-bf4f38d28e59-logs" (OuterVolumeSpecName: "logs") pod "44ffa825-906e-4f85-9da7-bf4f38d28e59" (UID: "44ffa825-906e-4f85-9da7-bf4f38d28e59"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:52:03 crc kubenswrapper[4599]: I1012 07:52:03.363797 4599 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44ffa825-906e-4f85-9da7-bf4f38d28e59-logs\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:03 crc kubenswrapper[4599]: I1012 07:52:03.377455 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44ffa825-906e-4f85-9da7-bf4f38d28e59-kube-api-access-vtxbh" (OuterVolumeSpecName: "kube-api-access-vtxbh") pod "44ffa825-906e-4f85-9da7-bf4f38d28e59" (UID: "44ffa825-906e-4f85-9da7-bf4f38d28e59"). InnerVolumeSpecName "kube-api-access-vtxbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:52:03 crc kubenswrapper[4599]: I1012 07:52:03.389223 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ffa825-906e-4f85-9da7-bf4f38d28e59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44ffa825-906e-4f85-9da7-bf4f38d28e59" (UID: "44ffa825-906e-4f85-9da7-bf4f38d28e59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:52:03 crc kubenswrapper[4599]: I1012 07:52:03.403244 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ffa825-906e-4f85-9da7-bf4f38d28e59-config-data" (OuterVolumeSpecName: "config-data") pod "44ffa825-906e-4f85-9da7-bf4f38d28e59" (UID: "44ffa825-906e-4f85-9da7-bf4f38d28e59"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:52:03 crc kubenswrapper[4599]: I1012 07:52:03.415174 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ffa825-906e-4f85-9da7-bf4f38d28e59-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "44ffa825-906e-4f85-9da7-bf4f38d28e59" (UID: "44ffa825-906e-4f85-9da7-bf4f38d28e59"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:52:03 crc kubenswrapper[4599]: I1012 07:52:03.465706 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ffa825-906e-4f85-9da7-bf4f38d28e59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:03 crc kubenswrapper[4599]: I1012 07:52:03.465976 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44ffa825-906e-4f85-9da7-bf4f38d28e59-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:03 crc kubenswrapper[4599]: I1012 07:52:03.465986 4599 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ffa825-906e-4f85-9da7-bf4f38d28e59-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:03 crc kubenswrapper[4599]: I1012 07:52:03.466001 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtxbh\" (UniqueName: \"kubernetes.io/projected/44ffa825-906e-4f85-9da7-bf4f38d28e59-kube-api-access-vtxbh\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.157792 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.182994 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.192224 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.198255 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 12 07:52:04 crc kubenswrapper[4599]: E1012 07:52:04.198674 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44ffa825-906e-4f85-9da7-bf4f38d28e59" containerName="nova-metadata-metadata" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.198694 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ffa825-906e-4f85-9da7-bf4f38d28e59" containerName="nova-metadata-metadata" Oct 12 07:52:04 crc kubenswrapper[4599]: E1012 07:52:04.198707 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44ffa825-906e-4f85-9da7-bf4f38d28e59" containerName="nova-metadata-log" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.198714 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ffa825-906e-4f85-9da7-bf4f38d28e59" containerName="nova-metadata-log" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.198936 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="44ffa825-906e-4f85-9da7-bf4f38d28e59" containerName="nova-metadata-log" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.198965 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="44ffa825-906e-4f85-9da7-bf4f38d28e59" containerName="nova-metadata-metadata" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.200023 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.201712 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.203802 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.209136 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.382126 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/092769ed-436e-4aee-b3e9-4b2eeb4c487e-logs\") pod \"nova-metadata-0\" (UID: \"092769ed-436e-4aee-b3e9-4b2eeb4c487e\") " pod="openstack/nova-metadata-0" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.382186 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhvbd\" (UniqueName: \"kubernetes.io/projected/092769ed-436e-4aee-b3e9-4b2eeb4c487e-kube-api-access-rhvbd\") pod \"nova-metadata-0\" (UID: \"092769ed-436e-4aee-b3e9-4b2eeb4c487e\") " pod="openstack/nova-metadata-0" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.382254 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092769ed-436e-4aee-b3e9-4b2eeb4c487e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"092769ed-436e-4aee-b3e9-4b2eeb4c487e\") " pod="openstack/nova-metadata-0" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.383510 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/092769ed-436e-4aee-b3e9-4b2eeb4c487e-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"092769ed-436e-4aee-b3e9-4b2eeb4c487e\") " pod="openstack/nova-metadata-0" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.383584 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/092769ed-436e-4aee-b3e9-4b2eeb4c487e-config-data\") pod \"nova-metadata-0\" (UID: \"092769ed-436e-4aee-b3e9-4b2eeb4c487e\") " pod="openstack/nova-metadata-0" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.486253 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/092769ed-436e-4aee-b3e9-4b2eeb4c487e-config-data\") pod \"nova-metadata-0\" (UID: \"092769ed-436e-4aee-b3e9-4b2eeb4c487e\") " pod="openstack/nova-metadata-0" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.486698 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/092769ed-436e-4aee-b3e9-4b2eeb4c487e-logs\") pod \"nova-metadata-0\" (UID: \"092769ed-436e-4aee-b3e9-4b2eeb4c487e\") " pod="openstack/nova-metadata-0" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.486757 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhvbd\" (UniqueName: \"kubernetes.io/projected/092769ed-436e-4aee-b3e9-4b2eeb4c487e-kube-api-access-rhvbd\") pod \"nova-metadata-0\" (UID: \"092769ed-436e-4aee-b3e9-4b2eeb4c487e\") " pod="openstack/nova-metadata-0" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.486854 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092769ed-436e-4aee-b3e9-4b2eeb4c487e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"092769ed-436e-4aee-b3e9-4b2eeb4c487e\") " pod="openstack/nova-metadata-0" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.486965 4599 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/092769ed-436e-4aee-b3e9-4b2eeb4c487e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"092769ed-436e-4aee-b3e9-4b2eeb4c487e\") " pod="openstack/nova-metadata-0" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.487278 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/092769ed-436e-4aee-b3e9-4b2eeb4c487e-logs\") pod \"nova-metadata-0\" (UID: \"092769ed-436e-4aee-b3e9-4b2eeb4c487e\") " pod="openstack/nova-metadata-0" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.491942 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/092769ed-436e-4aee-b3e9-4b2eeb4c487e-config-data\") pod \"nova-metadata-0\" (UID: \"092769ed-436e-4aee-b3e9-4b2eeb4c487e\") " pod="openstack/nova-metadata-0" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.492460 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092769ed-436e-4aee-b3e9-4b2eeb4c487e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"092769ed-436e-4aee-b3e9-4b2eeb4c487e\") " pod="openstack/nova-metadata-0" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.492717 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/092769ed-436e-4aee-b3e9-4b2eeb4c487e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"092769ed-436e-4aee-b3e9-4b2eeb4c487e\") " pod="openstack/nova-metadata-0" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.502551 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhvbd\" (UniqueName: \"kubernetes.io/projected/092769ed-436e-4aee-b3e9-4b2eeb4c487e-kube-api-access-rhvbd\") pod 
\"nova-metadata-0\" (UID: \"092769ed-436e-4aee-b3e9-4b2eeb4c487e\") " pod="openstack/nova-metadata-0" Oct 12 07:52:04 crc kubenswrapper[4599]: I1012 07:52:04.515739 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 07:52:05 crc kubenswrapper[4599]: I1012 07:52:04.940022 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 07:52:05 crc kubenswrapper[4599]: I1012 07:52:05.167851 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"092769ed-436e-4aee-b3e9-4b2eeb4c487e","Type":"ContainerStarted","Data":"5e11d8885b4c4337a6f5b862107a0019b5a6fd3ff3d64a8380fe7946be14d645"} Oct 12 07:52:05 crc kubenswrapper[4599]: I1012 07:52:05.167894 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"092769ed-436e-4aee-b3e9-4b2eeb4c487e","Type":"ContainerStarted","Data":"2ef869dbad423f18955f9229c81dc3b30fc4e72cd5d33642b7d02be24a6f623e"} Oct 12 07:52:05 crc kubenswrapper[4599]: I1012 07:52:05.555645 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44ffa825-906e-4f85-9da7-bf4f38d28e59" path="/var/lib/kubelet/pods/44ffa825-906e-4f85-9da7-bf4f38d28e59/volumes" Oct 12 07:52:05 crc kubenswrapper[4599]: I1012 07:52:05.920936 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.017731 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-public-tls-certs\") pod \"4d63b470-1f3c-4829-9f2b-f50560de4761\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.018322 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d63b470-1f3c-4829-9f2b-f50560de4761-logs\") pod \"4d63b470-1f3c-4829-9f2b-f50560de4761\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.018470 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtn89\" (UniqueName: \"kubernetes.io/projected/4d63b470-1f3c-4829-9f2b-f50560de4761-kube-api-access-gtn89\") pod \"4d63b470-1f3c-4829-9f2b-f50560de4761\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.018929 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d63b470-1f3c-4829-9f2b-f50560de4761-logs" (OuterVolumeSpecName: "logs") pod "4d63b470-1f3c-4829-9f2b-f50560de4761" (UID: "4d63b470-1f3c-4829-9f2b-f50560de4761"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.022779 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d63b470-1f3c-4829-9f2b-f50560de4761-kube-api-access-gtn89" (OuterVolumeSpecName: "kube-api-access-gtn89") pod "4d63b470-1f3c-4829-9f2b-f50560de4761" (UID: "4d63b470-1f3c-4829-9f2b-f50560de4761"). InnerVolumeSpecName "kube-api-access-gtn89". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.057033 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4d63b470-1f3c-4829-9f2b-f50560de4761" (UID: "4d63b470-1f3c-4829-9f2b-f50560de4761"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.120122 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-internal-tls-certs\") pod \"4d63b470-1f3c-4829-9f2b-f50560de4761\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.120176 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-config-data\") pod \"4d63b470-1f3c-4829-9f2b-f50560de4761\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.120234 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-combined-ca-bundle\") pod \"4d63b470-1f3c-4829-9f2b-f50560de4761\" (UID: \"4d63b470-1f3c-4829-9f2b-f50560de4761\") " Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.120859 4599 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d63b470-1f3c-4829-9f2b-f50560de4761-logs\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.120882 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtn89\" (UniqueName: 
\"kubernetes.io/projected/4d63b470-1f3c-4829-9f2b-f50560de4761-kube-api-access-gtn89\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.120893 4599 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.139256 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-config-data" (OuterVolumeSpecName: "config-data") pod "4d63b470-1f3c-4829-9f2b-f50560de4761" (UID: "4d63b470-1f3c-4829-9f2b-f50560de4761"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.146727 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d63b470-1f3c-4829-9f2b-f50560de4761" (UID: "4d63b470-1f3c-4829-9f2b-f50560de4761"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.159463 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4d63b470-1f3c-4829-9f2b-f50560de4761" (UID: "4d63b470-1f3c-4829-9f2b-f50560de4761"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.187959 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"092769ed-436e-4aee-b3e9-4b2eeb4c487e","Type":"ContainerStarted","Data":"75f3f572a2cc7e72e81d993d056f0c3708440e567d1e3217a7f8dd4f05302b03"} Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.190974 4599 generic.go:334] "Generic (PLEG): container finished" podID="4d63b470-1f3c-4829-9f2b-f50560de4761" containerID="62f0ea2a44e2fe1ffbfbc08042079fe9de672411baf277fdd2a76fcc4cd128dd" exitCode=0 Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.191052 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4d63b470-1f3c-4829-9f2b-f50560de4761","Type":"ContainerDied","Data":"62f0ea2a44e2fe1ffbfbc08042079fe9de672411baf277fdd2a76fcc4cd128dd"} Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.191089 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4d63b470-1f3c-4829-9f2b-f50560de4761","Type":"ContainerDied","Data":"bb27c18448eb85b1eaa452044d837f2315ec51d669f63fded8d430a920cdb97e"} Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.191129 4599 scope.go:117] "RemoveContainer" containerID="62f0ea2a44e2fe1ffbfbc08042079fe9de672411baf277fdd2a76fcc4cd128dd" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.191421 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.213591 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.213564845 podStartE2EDuration="2.213564845s" podCreationTimestamp="2025-10-12 07:52:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:52:06.205490351 +0000 UTC m=+1022.994685853" watchObservedRunningTime="2025-10-12 07:52:06.213564845 +0000 UTC m=+1023.002760347" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.226186 4599 scope.go:117] "RemoveContainer" containerID="64fa9a1c037021538c64e9a3010f89bc89d8f02666ba6a035b0f095101c1862b" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.226445 4599 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.227098 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.227112 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d63b470-1f3c-4829-9f2b-f50560de4761-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.244649 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.246178 4599 scope.go:117] "RemoveContainer" containerID="62f0ea2a44e2fe1ffbfbc08042079fe9de672411baf277fdd2a76fcc4cd128dd" Oct 12 07:52:06 crc kubenswrapper[4599]: E1012 07:52:06.246704 4599 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62f0ea2a44e2fe1ffbfbc08042079fe9de672411baf277fdd2a76fcc4cd128dd\": container with ID starting with 62f0ea2a44e2fe1ffbfbc08042079fe9de672411baf277fdd2a76fcc4cd128dd not found: ID does not exist" containerID="62f0ea2a44e2fe1ffbfbc08042079fe9de672411baf277fdd2a76fcc4cd128dd" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.246742 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62f0ea2a44e2fe1ffbfbc08042079fe9de672411baf277fdd2a76fcc4cd128dd"} err="failed to get container status \"62f0ea2a44e2fe1ffbfbc08042079fe9de672411baf277fdd2a76fcc4cd128dd\": rpc error: code = NotFound desc = could not find container \"62f0ea2a44e2fe1ffbfbc08042079fe9de672411baf277fdd2a76fcc4cd128dd\": container with ID starting with 62f0ea2a44e2fe1ffbfbc08042079fe9de672411baf277fdd2a76fcc4cd128dd not found: ID does not exist" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.246764 4599 scope.go:117] "RemoveContainer" containerID="64fa9a1c037021538c64e9a3010f89bc89d8f02666ba6a035b0f095101c1862b" Oct 12 07:52:06 crc kubenswrapper[4599]: E1012 07:52:06.247219 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64fa9a1c037021538c64e9a3010f89bc89d8f02666ba6a035b0f095101c1862b\": container with ID starting with 64fa9a1c037021538c64e9a3010f89bc89d8f02666ba6a035b0f095101c1862b not found: ID does not exist" containerID="64fa9a1c037021538c64e9a3010f89bc89d8f02666ba6a035b0f095101c1862b" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.247237 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64fa9a1c037021538c64e9a3010f89bc89d8f02666ba6a035b0f095101c1862b"} err="failed to get container status \"64fa9a1c037021538c64e9a3010f89bc89d8f02666ba6a035b0f095101c1862b\": rpc error: code = NotFound desc = could 
not find container \"64fa9a1c037021538c64e9a3010f89bc89d8f02666ba6a035b0f095101c1862b\": container with ID starting with 64fa9a1c037021538c64e9a3010f89bc89d8f02666ba6a035b0f095101c1862b not found: ID does not exist" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.255310 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.263477 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 12 07:52:06 crc kubenswrapper[4599]: E1012 07:52:06.263979 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d63b470-1f3c-4829-9f2b-f50560de4761" containerName="nova-api-api" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.263999 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d63b470-1f3c-4829-9f2b-f50560de4761" containerName="nova-api-api" Oct 12 07:52:06 crc kubenswrapper[4599]: E1012 07:52:06.264022 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d63b470-1f3c-4829-9f2b-f50560de4761" containerName="nova-api-log" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.264028 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d63b470-1f3c-4829-9f2b-f50560de4761" containerName="nova-api-log" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.264190 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d63b470-1f3c-4829-9f2b-f50560de4761" containerName="nova-api-log" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.264217 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d63b470-1f3c-4829-9f2b-f50560de4761" containerName="nova-api-api" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.265383 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.269036 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.269469 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.269492 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.269792 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.328669 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd526957-c4dc-40c6-87e3-eb3784e09fb5-config-data\") pod \"nova-api-0\" (UID: \"fd526957-c4dc-40c6-87e3-eb3784e09fb5\") " pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.328722 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd526957-c4dc-40c6-87e3-eb3784e09fb5-public-tls-certs\") pod \"nova-api-0\" (UID: \"fd526957-c4dc-40c6-87e3-eb3784e09fb5\") " pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.328745 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd526957-c4dc-40c6-87e3-eb3784e09fb5-logs\") pod \"nova-api-0\" (UID: \"fd526957-c4dc-40c6-87e3-eb3784e09fb5\") " pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.329197 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fd526957-c4dc-40c6-87e3-eb3784e09fb5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fd526957-c4dc-40c6-87e3-eb3784e09fb5\") " pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.329477 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd526957-c4dc-40c6-87e3-eb3784e09fb5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fd526957-c4dc-40c6-87e3-eb3784e09fb5\") " pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.329616 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp72f\" (UniqueName: \"kubernetes.io/projected/fd526957-c4dc-40c6-87e3-eb3784e09fb5-kube-api-access-mp72f\") pod \"nova-api-0\" (UID: \"fd526957-c4dc-40c6-87e3-eb3784e09fb5\") " pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.431000 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd526957-c4dc-40c6-87e3-eb3784e09fb5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fd526957-c4dc-40c6-87e3-eb3784e09fb5\") " pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.431116 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd526957-c4dc-40c6-87e3-eb3784e09fb5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fd526957-c4dc-40c6-87e3-eb3784e09fb5\") " pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.431213 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp72f\" (UniqueName: \"kubernetes.io/projected/fd526957-c4dc-40c6-87e3-eb3784e09fb5-kube-api-access-mp72f\") pod \"nova-api-0\" (UID: \"fd526957-c4dc-40c6-87e3-eb3784e09fb5\") " 
pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.431350 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd526957-c4dc-40c6-87e3-eb3784e09fb5-config-data\") pod \"nova-api-0\" (UID: \"fd526957-c4dc-40c6-87e3-eb3784e09fb5\") " pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.431424 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd526957-c4dc-40c6-87e3-eb3784e09fb5-public-tls-certs\") pod \"nova-api-0\" (UID: \"fd526957-c4dc-40c6-87e3-eb3784e09fb5\") " pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.431604 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd526957-c4dc-40c6-87e3-eb3784e09fb5-logs\") pod \"nova-api-0\" (UID: \"fd526957-c4dc-40c6-87e3-eb3784e09fb5\") " pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.431936 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd526957-c4dc-40c6-87e3-eb3784e09fb5-logs\") pod \"nova-api-0\" (UID: \"fd526957-c4dc-40c6-87e3-eb3784e09fb5\") " pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.435862 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd526957-c4dc-40c6-87e3-eb3784e09fb5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fd526957-c4dc-40c6-87e3-eb3784e09fb5\") " pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.435908 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd526957-c4dc-40c6-87e3-eb3784e09fb5-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"fd526957-c4dc-40c6-87e3-eb3784e09fb5\") " pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.436301 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd526957-c4dc-40c6-87e3-eb3784e09fb5-public-tls-certs\") pod \"nova-api-0\" (UID: \"fd526957-c4dc-40c6-87e3-eb3784e09fb5\") " pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.437020 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd526957-c4dc-40c6-87e3-eb3784e09fb5-config-data\") pod \"nova-api-0\" (UID: \"fd526957-c4dc-40c6-87e3-eb3784e09fb5\") " pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.445827 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp72f\" (UniqueName: \"kubernetes.io/projected/fd526957-c4dc-40c6-87e3-eb3784e09fb5-kube-api-access-mp72f\") pod \"nova-api-0\" (UID: \"fd526957-c4dc-40c6-87e3-eb3784e09fb5\") " pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.493398 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.584131 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 12 07:52:06 crc kubenswrapper[4599]: I1012 07:52:06.981481 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 07:52:06 crc kubenswrapper[4599]: W1012 07:52:06.986764 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd526957_c4dc_40c6_87e3_eb3784e09fb5.slice/crio-66ab54dd8658279fbb28554755bdbc9ae9087881ee5b4014cd61c4b95c56fe8f WatchSource:0}: Error finding container 66ab54dd8658279fbb28554755bdbc9ae9087881ee5b4014cd61c4b95c56fe8f: Status 404 returned error can't find the container with id 66ab54dd8658279fbb28554755bdbc9ae9087881ee5b4014cd61c4b95c56fe8f Oct 12 07:52:07 crc kubenswrapper[4599]: I1012 07:52:07.206497 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fd526957-c4dc-40c6-87e3-eb3784e09fb5","Type":"ContainerStarted","Data":"c93c326a3d5e8d39984fdec3a2f11d497b229382202e73c7529f5eb09bad2746"} Oct 12 07:52:07 crc kubenswrapper[4599]: I1012 07:52:07.206873 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fd526957-c4dc-40c6-87e3-eb3784e09fb5","Type":"ContainerStarted","Data":"66ab54dd8658279fbb28554755bdbc9ae9087881ee5b4014cd61c4b95c56fe8f"} Oct 12 07:52:07 crc kubenswrapper[4599]: I1012 07:52:07.554820 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d63b470-1f3c-4829-9f2b-f50560de4761" path="/var/lib/kubelet/pods/4d63b470-1f3c-4829-9f2b-f50560de4761/volumes" Oct 12 07:52:08 crc kubenswrapper[4599]: I1012 07:52:08.215887 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fd526957-c4dc-40c6-87e3-eb3784e09fb5","Type":"ContainerStarted","Data":"58da7c9cd04aafa9210686321d9ebe3ec05644a03952adace805e3686558deae"} Oct 12 07:52:08 crc kubenswrapper[4599]: I1012 07:52:08.237667 4599 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.237642635 podStartE2EDuration="2.237642635s" podCreationTimestamp="2025-10-12 07:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:52:08.230835531 +0000 UTC m=+1025.020031033" watchObservedRunningTime="2025-10-12 07:52:08.237642635 +0000 UTC m=+1025.026838137" Oct 12 07:52:09 crc kubenswrapper[4599]: I1012 07:52:09.518009 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 12 07:52:09 crc kubenswrapper[4599]: I1012 07:52:09.518076 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 12 07:52:11 crc kubenswrapper[4599]: I1012 07:52:11.493063 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 12 07:52:11 crc kubenswrapper[4599]: I1012 07:52:11.519829 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 12 07:52:12 crc kubenswrapper[4599]: I1012 07:52:12.278522 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 12 07:52:14 crc kubenswrapper[4599]: I1012 07:52:14.516474 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 12 07:52:14 crc kubenswrapper[4599]: I1012 07:52:14.516928 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 12 07:52:15 crc kubenswrapper[4599]: I1012 07:52:15.537494 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="092769ed-436e-4aee-b3e9-4b2eeb4c487e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 
12 07:52:15 crc kubenswrapper[4599]: I1012 07:52:15.537903 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="092769ed-436e-4aee-b3e9-4b2eeb4c487e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 12 07:52:16 crc kubenswrapper[4599]: I1012 07:52:16.585176 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 07:52:16 crc kubenswrapper[4599]: I1012 07:52:16.585251 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 07:52:17 crc kubenswrapper[4599]: I1012 07:52:17.604490 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fd526957-c4dc-40c6-87e3-eb3784e09fb5" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 12 07:52:17 crc kubenswrapper[4599]: I1012 07:52:17.604551 4599 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fd526957-c4dc-40c6-87e3-eb3784e09fb5" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 12 07:52:18 crc kubenswrapper[4599]: I1012 07:52:18.290539 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 12 07:52:24 crc kubenswrapper[4599]: I1012 07:52:24.522988 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 12 07:52:24 crc kubenswrapper[4599]: I1012 07:52:24.523783 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 12 07:52:24 crc kubenswrapper[4599]: I1012 07:52:24.528268 4599 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 12 07:52:24 crc kubenswrapper[4599]: I1012 07:52:24.528638 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 12 07:52:26 crc kubenswrapper[4599]: I1012 07:52:26.592000 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 12 07:52:26 crc kubenswrapper[4599]: I1012 07:52:26.592552 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 12 07:52:26 crc kubenswrapper[4599]: I1012 07:52:26.594133 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 12 07:52:26 crc kubenswrapper[4599]: I1012 07:52:26.596954 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 12 07:52:27 crc kubenswrapper[4599]: I1012 07:52:27.390792 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 12 07:52:27 crc kubenswrapper[4599]: I1012 07:52:27.397867 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 12 07:52:28 crc kubenswrapper[4599]: I1012 07:52:28.322315 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:52:28 crc kubenswrapper[4599]: I1012 07:52:28.322425 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 
07:52:33 crc kubenswrapper[4599]: I1012 07:52:33.977378 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 07:52:34 crc kubenswrapper[4599]: I1012 07:52:34.642988 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 07:52:37 crc kubenswrapper[4599]: I1012 07:52:37.780753 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="4612b7e8-a507-4c57-989d-3411e4e302dd" containerName="rabbitmq" containerID="cri-o://d136139635ab8d327cc5286d3b7c813647993b35ce4f04c45a4355e388e1551c" gracePeriod=604797 Oct 12 07:52:38 crc kubenswrapper[4599]: I1012 07:52:38.156153 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="2e036a1a-bc46-419f-88e4-312037490ec1" containerName="rabbitmq" containerID="cri-o://ccf62acba6befb5b9bba752893021602b49c1f46c423bd607fea85c3143a2178" gracePeriod=604797 Oct 12 07:52:39 crc kubenswrapper[4599]: I1012 07:52:39.603784 4599 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="4612b7e8-a507-4c57-989d-3411e4e302dd" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.97:5671: connect: connection refused" Oct 12 07:52:39 crc kubenswrapper[4599]: I1012 07:52:39.903086 4599 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="2e036a1a-bc46-419f-88e4-312037490ec1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.402597 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.456316 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4612b7e8-a507-4c57-989d-3411e4e302dd-config-data\") pod \"4612b7e8-a507-4c57-989d-3411e4e302dd\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.456376 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-tls\") pod \"4612b7e8-a507-4c57-989d-3411e4e302dd\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.456405 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"4612b7e8-a507-4c57-989d-3411e4e302dd\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.456424 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4612b7e8-a507-4c57-989d-3411e4e302dd-server-conf\") pod \"4612b7e8-a507-4c57-989d-3411e4e302dd\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.456476 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-erlang-cookie\") pod \"4612b7e8-a507-4c57-989d-3411e4e302dd\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.456508 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-confd\") pod \"4612b7e8-a507-4c57-989d-3411e4e302dd\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.456528 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x98v2\" (UniqueName: \"kubernetes.io/projected/4612b7e8-a507-4c57-989d-3411e4e302dd-kube-api-access-x98v2\") pod \"4612b7e8-a507-4c57-989d-3411e4e302dd\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.456545 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4612b7e8-a507-4c57-989d-3411e4e302dd-erlang-cookie-secret\") pod \"4612b7e8-a507-4c57-989d-3411e4e302dd\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.456572 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-plugins\") pod \"4612b7e8-a507-4c57-989d-3411e4e302dd\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.456593 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4612b7e8-a507-4c57-989d-3411e4e302dd-pod-info\") pod \"4612b7e8-a507-4c57-989d-3411e4e302dd\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.456639 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4612b7e8-a507-4c57-989d-3411e4e302dd-plugins-conf\") pod \"4612b7e8-a507-4c57-989d-3411e4e302dd\" (UID: \"4612b7e8-a507-4c57-989d-3411e4e302dd\") " Oct 12 07:52:44 crc 
kubenswrapper[4599]: I1012 07:52:44.457700 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4612b7e8-a507-4c57-989d-3411e4e302dd-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4612b7e8-a507-4c57-989d-3411e4e302dd" (UID: "4612b7e8-a507-4c57-989d-3411e4e302dd"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.460023 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4612b7e8-a507-4c57-989d-3411e4e302dd" (UID: "4612b7e8-a507-4c57-989d-3411e4e302dd"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.463042 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4612b7e8-a507-4c57-989d-3411e4e302dd" (UID: "4612b7e8-a507-4c57-989d-3411e4e302dd"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.476298 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4612b7e8-a507-4c57-989d-3411e4e302dd-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4612b7e8-a507-4c57-989d-3411e4e302dd" (UID: "4612b7e8-a507-4c57-989d-3411e4e302dd"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.476315 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "4612b7e8-a507-4c57-989d-3411e4e302dd" (UID: "4612b7e8-a507-4c57-989d-3411e4e302dd"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.476464 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "4612b7e8-a507-4c57-989d-3411e4e302dd" (UID: "4612b7e8-a507-4c57-989d-3411e4e302dd"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.486648 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4612b7e8-a507-4c57-989d-3411e4e302dd-kube-api-access-x98v2" (OuterVolumeSpecName: "kube-api-access-x98v2") pod "4612b7e8-a507-4c57-989d-3411e4e302dd" (UID: "4612b7e8-a507-4c57-989d-3411e4e302dd"). InnerVolumeSpecName "kube-api-access-x98v2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.503379 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4612b7e8-a507-4c57-989d-3411e4e302dd-pod-info" (OuterVolumeSpecName: "pod-info") pod "4612b7e8-a507-4c57-989d-3411e4e302dd" (UID: "4612b7e8-a507-4c57-989d-3411e4e302dd"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.508667 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4612b7e8-a507-4c57-989d-3411e4e302dd-config-data" (OuterVolumeSpecName: "config-data") pod "4612b7e8-a507-4c57-989d-3411e4e302dd" (UID: "4612b7e8-a507-4c57-989d-3411e4e302dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.522415 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.539106 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4612b7e8-a507-4c57-989d-3411e4e302dd-server-conf" (OuterVolumeSpecName: "server-conf") pod "4612b7e8-a507-4c57-989d-3411e4e302dd" (UID: "4612b7e8-a507-4c57-989d-3411e4e302dd"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.558215 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2e036a1a-bc46-419f-88e4-312037490ec1-erlang-cookie-secret\") pod \"2e036a1a-bc46-419f-88e4-312037490ec1\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.558384 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-confd\") pod \"2e036a1a-bc46-419f-88e4-312037490ec1\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.558416 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"2e036a1a-bc46-419f-88e4-312037490ec1\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.558521 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-erlang-cookie\") pod \"2e036a1a-bc46-419f-88e4-312037490ec1\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.558566 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2e036a1a-bc46-419f-88e4-312037490ec1-pod-info\") pod \"2e036a1a-bc46-419f-88e4-312037490ec1\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.558589 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/2e036a1a-bc46-419f-88e4-312037490ec1-config-data\") pod \"2e036a1a-bc46-419f-88e4-312037490ec1\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.558669 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2e036a1a-bc46-419f-88e4-312037490ec1-server-conf\") pod \"2e036a1a-bc46-419f-88e4-312037490ec1\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.558794 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2e036a1a-bc46-419f-88e4-312037490ec1-plugins-conf\") pod \"2e036a1a-bc46-419f-88e4-312037490ec1\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.558814 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-plugins\") pod \"2e036a1a-bc46-419f-88e4-312037490ec1\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.558961 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-tls\") pod \"2e036a1a-bc46-419f-88e4-312037490ec1\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.558992 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g59r\" (UniqueName: \"kubernetes.io/projected/2e036a1a-bc46-419f-88e4-312037490ec1-kube-api-access-6g59r\") pod \"2e036a1a-bc46-419f-88e4-312037490ec1\" (UID: \"2e036a1a-bc46-419f-88e4-312037490ec1\") " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 
07:52:44.560221 4599 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4612b7e8-a507-4c57-989d-3411e4e302dd-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.560267 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4612b7e8-a507-4c57-989d-3411e4e302dd-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.560277 4599 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.560286 4599 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4612b7e8-a507-4c57-989d-3411e4e302dd-server-conf\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.560314 4599 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.560364 4599 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.560376 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x98v2\" (UniqueName: \"kubernetes.io/projected/4612b7e8-a507-4c57-989d-3411e4e302dd-kube-api-access-x98v2\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.560385 4599 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/4612b7e8-a507-4c57-989d-3411e4e302dd-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.560393 4599 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.560401 4599 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4612b7e8-a507-4c57-989d-3411e4e302dd-pod-info\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.561232 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e036a1a-bc46-419f-88e4-312037490ec1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2e036a1a-bc46-419f-88e4-312037490ec1" (UID: "2e036a1a-bc46-419f-88e4-312037490ec1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.562813 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2e036a1a-bc46-419f-88e4-312037490ec1" (UID: "2e036a1a-bc46-419f-88e4-312037490ec1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.563206 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e036a1a-bc46-419f-88e4-312037490ec1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2e036a1a-bc46-419f-88e4-312037490ec1" (UID: "2e036a1a-bc46-419f-88e4-312037490ec1"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.565839 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2e036a1a-bc46-419f-88e4-312037490ec1-pod-info" (OuterVolumeSpecName: "pod-info") pod "2e036a1a-bc46-419f-88e4-312037490ec1" (UID: "2e036a1a-bc46-419f-88e4-312037490ec1"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.565886 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e036a1a-bc46-419f-88e4-312037490ec1-kube-api-access-6g59r" (OuterVolumeSpecName: "kube-api-access-6g59r") pod "2e036a1a-bc46-419f-88e4-312037490ec1" (UID: "2e036a1a-bc46-419f-88e4-312037490ec1"). InnerVolumeSpecName "kube-api-access-6g59r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.566147 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "2e036a1a-bc46-419f-88e4-312037490ec1" (UID: "2e036a1a-bc46-419f-88e4-312037490ec1"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.561354 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2e036a1a-bc46-419f-88e4-312037490ec1" (UID: "2e036a1a-bc46-419f-88e4-312037490ec1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.571402 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2e036a1a-bc46-419f-88e4-312037490ec1" (UID: "2e036a1a-bc46-419f-88e4-312037490ec1"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.588026 4599 generic.go:334] "Generic (PLEG): container finished" podID="2e036a1a-bc46-419f-88e4-312037490ec1" containerID="ccf62acba6befb5b9bba752893021602b49c1f46c423bd607fea85c3143a2178" exitCode=0 Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.588091 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2e036a1a-bc46-419f-88e4-312037490ec1","Type":"ContainerDied","Data":"ccf62acba6befb5b9bba752893021602b49c1f46c423bd607fea85c3143a2178"} Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.588122 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2e036a1a-bc46-419f-88e4-312037490ec1","Type":"ContainerDied","Data":"8fe3e42b1dec5d7152fb84a3223b1d88da96ba8ea7f99ef467f1bf5d1e0cd136"} Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.588138 4599 scope.go:117] "RemoveContainer" containerID="ccf62acba6befb5b9bba752893021602b49c1f46c423bd607fea85c3143a2178" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.588268 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.591600 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4612b7e8-a507-4c57-989d-3411e4e302dd" (UID: "4612b7e8-a507-4c57-989d-3411e4e302dd"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.592661 4599 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.600887 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e036a1a-bc46-419f-88e4-312037490ec1-config-data" (OuterVolumeSpecName: "config-data") pod "2e036a1a-bc46-419f-88e4-312037490ec1" (UID: "2e036a1a-bc46-419f-88e4-312037490ec1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.604173 4599 generic.go:334] "Generic (PLEG): container finished" podID="4612b7e8-a507-4c57-989d-3411e4e302dd" containerID="d136139635ab8d327cc5286d3b7c813647993b35ce4f04c45a4355e388e1551c" exitCode=0 Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.604221 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4612b7e8-a507-4c57-989d-3411e4e302dd","Type":"ContainerDied","Data":"d136139635ab8d327cc5286d3b7c813647993b35ce4f04c45a4355e388e1551c"} Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.604252 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4612b7e8-a507-4c57-989d-3411e4e302dd","Type":"ContainerDied","Data":"799743a26f0045717426a4c23e0ed1b66e3dd4db0002797f963c997b1a92d56d"} Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.604345 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.625504 4599 scope.go:117] "RemoveContainer" containerID="8fe013c90596b20721f39bf5f7fa9a8546584a44b02099db74e730f6de411d84" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.662228 4599 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.662553 4599 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2e036a1a-bc46-419f-88e4-312037490ec1-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.662563 4599 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.662571 4599 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4612b7e8-a507-4c57-989d-3411e4e302dd-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.662579 4599 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.662588 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g59r\" (UniqueName: \"kubernetes.io/projected/2e036a1a-bc46-419f-88e4-312037490ec1-kube-api-access-6g59r\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.662605 4599 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/2e036a1a-bc46-419f-88e4-312037490ec1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.662630 4599 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.662639 4599 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.662646 4599 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2e036a1a-bc46-419f-88e4-312037490ec1-pod-info\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.662654 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e036a1a-bc46-419f-88e4-312037490ec1-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.701417 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e036a1a-bc46-419f-88e4-312037490ec1-server-conf" (OuterVolumeSpecName: "server-conf") pod "2e036a1a-bc46-419f-88e4-312037490ec1" (UID: "2e036a1a-bc46-419f-88e4-312037490ec1"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.712874 4599 scope.go:117] "RemoveContainer" containerID="ccf62acba6befb5b9bba752893021602b49c1f46c423bd607fea85c3143a2178" Oct 12 07:52:44 crc kubenswrapper[4599]: E1012 07:52:44.757555 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccf62acba6befb5b9bba752893021602b49c1f46c423bd607fea85c3143a2178\": container with ID starting with ccf62acba6befb5b9bba752893021602b49c1f46c423bd607fea85c3143a2178 not found: ID does not exist" containerID="ccf62acba6befb5b9bba752893021602b49c1f46c423bd607fea85c3143a2178" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.757614 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccf62acba6befb5b9bba752893021602b49c1f46c423bd607fea85c3143a2178"} err="failed to get container status \"ccf62acba6befb5b9bba752893021602b49c1f46c423bd607fea85c3143a2178\": rpc error: code = NotFound desc = could not find container \"ccf62acba6befb5b9bba752893021602b49c1f46c423bd607fea85c3143a2178\": container with ID starting with ccf62acba6befb5b9bba752893021602b49c1f46c423bd607fea85c3143a2178 not found: ID does not exist" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.757644 4599 scope.go:117] "RemoveContainer" containerID="8fe013c90596b20721f39bf5f7fa9a8546584a44b02099db74e730f6de411d84" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.757768 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.765932 4599 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2e036a1a-bc46-419f-88e4-312037490ec1-server-conf\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: E1012 07:52:44.766035 4599 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"8fe013c90596b20721f39bf5f7fa9a8546584a44b02099db74e730f6de411d84\": container with ID starting with 8fe013c90596b20721f39bf5f7fa9a8546584a44b02099db74e730f6de411d84 not found: ID does not exist" containerID="8fe013c90596b20721f39bf5f7fa9a8546584a44b02099db74e730f6de411d84" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.766058 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fe013c90596b20721f39bf5f7fa9a8546584a44b02099db74e730f6de411d84"} err="failed to get container status \"8fe013c90596b20721f39bf5f7fa9a8546584a44b02099db74e730f6de411d84\": rpc error: code = NotFound desc = could not find container \"8fe013c90596b20721f39bf5f7fa9a8546584a44b02099db74e730f6de411d84\": container with ID starting with 8fe013c90596b20721f39bf5f7fa9a8546584a44b02099db74e730f6de411d84 not found: ID does not exist" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.766084 4599 scope.go:117] "RemoveContainer" containerID="d136139635ab8d327cc5286d3b7c813647993b35ce4f04c45a4355e388e1551c" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.810607 4599 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.817928 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.839088 4599 scope.go:117] "RemoveContainer" containerID="5dcb020f2b51b1d9b71cd795b65e6f8bc68eb5b0896d6d3e8026f5bbb0c957cb" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.841988 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 07:52:44 crc kubenswrapper[4599]: E1012 07:52:44.842419 4599 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2e036a1a-bc46-419f-88e4-312037490ec1" containerName="setup-container" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.842438 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e036a1a-bc46-419f-88e4-312037490ec1" containerName="setup-container" Oct 12 07:52:44 crc kubenswrapper[4599]: E1012 07:52:44.842466 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e036a1a-bc46-419f-88e4-312037490ec1" containerName="rabbitmq" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.842472 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e036a1a-bc46-419f-88e4-312037490ec1" containerName="rabbitmq" Oct 12 07:52:44 crc kubenswrapper[4599]: E1012 07:52:44.842491 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4612b7e8-a507-4c57-989d-3411e4e302dd" containerName="rabbitmq" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.842497 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="4612b7e8-a507-4c57-989d-3411e4e302dd" containerName="rabbitmq" Oct 12 07:52:44 crc kubenswrapper[4599]: E1012 07:52:44.842506 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4612b7e8-a507-4c57-989d-3411e4e302dd" containerName="setup-container" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.842512 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="4612b7e8-a507-4c57-989d-3411e4e302dd" containerName="setup-container" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.842704 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="4612b7e8-a507-4c57-989d-3411e4e302dd" containerName="rabbitmq" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.842736 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e036a1a-bc46-419f-88e4-312037490ec1" containerName="rabbitmq" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.844476 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.847078 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.847407 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.847742 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.848004 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-gj9zn" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.848204 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.848411 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.848596 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.853391 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.864397 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2e036a1a-bc46-419f-88e4-312037490ec1" (UID: "2e036a1a-bc46-419f-88e4-312037490ec1"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.868819 4599 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2e036a1a-bc46-419f-88e4-312037490ec1-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.868844 4599 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.879133 4599 scope.go:117] "RemoveContainer" containerID="d136139635ab8d327cc5286d3b7c813647993b35ce4f04c45a4355e388e1551c" Oct 12 07:52:44 crc kubenswrapper[4599]: E1012 07:52:44.880668 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d136139635ab8d327cc5286d3b7c813647993b35ce4f04c45a4355e388e1551c\": container with ID starting with d136139635ab8d327cc5286d3b7c813647993b35ce4f04c45a4355e388e1551c not found: ID does not exist" containerID="d136139635ab8d327cc5286d3b7c813647993b35ce4f04c45a4355e388e1551c" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.880701 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d136139635ab8d327cc5286d3b7c813647993b35ce4f04c45a4355e388e1551c"} err="failed to get container status \"d136139635ab8d327cc5286d3b7c813647993b35ce4f04c45a4355e388e1551c\": rpc error: code = NotFound desc = could not find container \"d136139635ab8d327cc5286d3b7c813647993b35ce4f04c45a4355e388e1551c\": container with ID starting with d136139635ab8d327cc5286d3b7c813647993b35ce4f04c45a4355e388e1551c not found: ID does not exist" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.880737 4599 scope.go:117] "RemoveContainer" containerID="5dcb020f2b51b1d9b71cd795b65e6f8bc68eb5b0896d6d3e8026f5bbb0c957cb" Oct 12 07:52:44 
crc kubenswrapper[4599]: E1012 07:52:44.882458 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dcb020f2b51b1d9b71cd795b65e6f8bc68eb5b0896d6d3e8026f5bbb0c957cb\": container with ID starting with 5dcb020f2b51b1d9b71cd795b65e6f8bc68eb5b0896d6d3e8026f5bbb0c957cb not found: ID does not exist" containerID="5dcb020f2b51b1d9b71cd795b65e6f8bc68eb5b0896d6d3e8026f5bbb0c957cb" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.882503 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dcb020f2b51b1d9b71cd795b65e6f8bc68eb5b0896d6d3e8026f5bbb0c957cb"} err="failed to get container status \"5dcb020f2b51b1d9b71cd795b65e6f8bc68eb5b0896d6d3e8026f5bbb0c957cb\": rpc error: code = NotFound desc = could not find container \"5dcb020f2b51b1d9b71cd795b65e6f8bc68eb5b0896d6d3e8026f5bbb0c957cb\": container with ID starting with 5dcb020f2b51b1d9b71cd795b65e6f8bc68eb5b0896d6d3e8026f5bbb0c957cb not found: ID does not exist" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.916676 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.930496 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.940977 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.946553 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.950528 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.950595 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.950882 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.951210 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.951464 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rgq2c" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.951853 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.951882 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.952141 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.970126 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab388f40-a761-44f6-812f-df5cf4b02b73-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.970193 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab388f40-a761-44f6-812f-df5cf4b02b73-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.970218 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ab388f40-a761-44f6-812f-df5cf4b02b73-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.970236 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab388f40-a761-44f6-812f-df5cf4b02b73-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.970263 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k547\" (UniqueName: \"kubernetes.io/projected/ab388f40-a761-44f6-812f-df5cf4b02b73-kube-api-access-7k547\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.970282 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab388f40-a761-44f6-812f-df5cf4b02b73-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.970299 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.970377 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ab388f40-a761-44f6-812f-df5cf4b02b73-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.970398 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ab388f40-a761-44f6-812f-df5cf4b02b73-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.970433 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab388f40-a761-44f6-812f-df5cf4b02b73-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:44 crc kubenswrapper[4599]: I1012 07:52:44.970450 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab388f40-a761-44f6-812f-df5cf4b02b73-config-data\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.073115 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab388f40-a761-44f6-812f-df5cf4b02b73-plugins-conf\") pod \"rabbitmq-server-0\" 
(UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.073206 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx6d9\" (UniqueName: \"kubernetes.io/projected/a9c90266-89e1-4527-8fa2-91826cbcc778-kube-api-access-vx6d9\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.073252 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a9c90266-89e1-4527-8fa2-91826cbcc778-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.073279 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab388f40-a761-44f6-812f-df5cf4b02b73-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.073302 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ab388f40-a761-44f6-812f-df5cf4b02b73-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.073489 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab388f40-a761-44f6-812f-df5cf4b02b73-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 
07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.073574 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a9c90266-89e1-4527-8fa2-91826cbcc778-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.073681 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k547\" (UniqueName: \"kubernetes.io/projected/ab388f40-a761-44f6-812f-df5cf4b02b73-kube-api-access-7k547\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.073890 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a9c90266-89e1-4527-8fa2-91826cbcc778-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.074083 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab388f40-a761-44f6-812f-df5cf4b02b73-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.074183 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a9c90266-89e1-4527-8fa2-91826cbcc778-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.074300 4599 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.074704 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ab388f40-a761-44f6-812f-df5cf4b02b73-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.074797 4599 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.074814 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.074925 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a9c90266-89e1-4527-8fa2-91826cbcc778-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.075011 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/ab388f40-a761-44f6-812f-df5cf4b02b73-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.075097 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a9c90266-89e1-4527-8fa2-91826cbcc778-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.075182 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a9c90266-89e1-4527-8fa2-91826cbcc778-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.075407 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab388f40-a761-44f6-812f-df5cf4b02b73-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.075516 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab388f40-a761-44f6-812f-df5cf4b02b73-config-data\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.075589 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a9c90266-89e1-4527-8fa2-91826cbcc778-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.075456 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab388f40-a761-44f6-812f-df5cf4b02b73-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.075778 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ab388f40-a761-44f6-812f-df5cf4b02b73-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.075806 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a9c90266-89e1-4527-8fa2-91826cbcc778-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.075847 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ab388f40-a761-44f6-812f-df5cf4b02b73-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.076202 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab388f40-a761-44f6-812f-df5cf4b02b73-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc 
kubenswrapper[4599]: I1012 07:52:45.076487 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab388f40-a761-44f6-812f-df5cf4b02b73-config-data\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.079540 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab388f40-a761-44f6-812f-df5cf4b02b73-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.079889 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ab388f40-a761-44f6-812f-df5cf4b02b73-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.081371 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab388f40-a761-44f6-812f-df5cf4b02b73-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.081889 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab388f40-a761-44f6-812f-df5cf4b02b73-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.091916 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k547\" (UniqueName: 
\"kubernetes.io/projected/ab388f40-a761-44f6-812f-df5cf4b02b73-kube-api-access-7k547\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.113711 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"ab388f40-a761-44f6-812f-df5cf4b02b73\") " pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.178425 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx6d9\" (UniqueName: \"kubernetes.io/projected/a9c90266-89e1-4527-8fa2-91826cbcc778-kube-api-access-vx6d9\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.178482 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a9c90266-89e1-4527-8fa2-91826cbcc778-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.178516 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a9c90266-89e1-4527-8fa2-91826cbcc778-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.178546 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a9c90266-89e1-4527-8fa2-91826cbcc778-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.178561 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a9c90266-89e1-4527-8fa2-91826cbcc778-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.178619 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.178634 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a9c90266-89e1-4527-8fa2-91826cbcc778-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.178654 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a9c90266-89e1-4527-8fa2-91826cbcc778-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.178671 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a9c90266-89e1-4527-8fa2-91826cbcc778-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 
07:52:45.178703 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a9c90266-89e1-4527-8fa2-91826cbcc778-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.178729 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a9c90266-89e1-4527-8fa2-91826cbcc778-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.178888 4599 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.180184 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a9c90266-89e1-4527-8fa2-91826cbcc778-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.180223 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a9c90266-89e1-4527-8fa2-91826cbcc778-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.180615 4599 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a9c90266-89e1-4527-8fa2-91826cbcc778-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.180803 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a9c90266-89e1-4527-8fa2-91826cbcc778-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.180984 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.182513 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a9c90266-89e1-4527-8fa2-91826cbcc778-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.183596 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a9c90266-89e1-4527-8fa2-91826cbcc778-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.184108 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a9c90266-89e1-4527-8fa2-91826cbcc778-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.189579 4599 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a9c90266-89e1-4527-8fa2-91826cbcc778-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.189969 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a9c90266-89e1-4527-8fa2-91826cbcc778-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.201757 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx6d9\" (UniqueName: \"kubernetes.io/projected/a9c90266-89e1-4527-8fa2-91826cbcc778-kube-api-access-vx6d9\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.213505 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a9c90266-89e1-4527-8fa2-91826cbcc778\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.264073 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.557692 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e036a1a-bc46-419f-88e4-312037490ec1" path="/var/lib/kubelet/pods/2e036a1a-bc46-419f-88e4-312037490ec1/volumes" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.558522 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4612b7e8-a507-4c57-989d-3411e4e302dd" path="/var/lib/kubelet/pods/4612b7e8-a507-4c57-989d-3411e4e302dd/volumes" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.593776 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.627795 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ab388f40-a761-44f6-812f-df5cf4b02b73","Type":"ContainerStarted","Data":"cb14c3845066239322f8faf20c027e806edf5687ca396f47c00ed346872dd6db"} Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.677456 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.792608 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c979c585f-grz99"] Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.794180 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.796043 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.827569 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c979c585f-grz99"] Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.896139 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-ovsdbserver-nb\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.896479 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckqzx\" (UniqueName: \"kubernetes.io/projected/b88d7e52-0543-47be-adf8-5294aeabb3e9-kube-api-access-ckqzx\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.896752 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-ovsdbserver-sb\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.897024 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: 
\"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.897296 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-dns-svc\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.897365 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-config\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.897474 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-dns-swift-storage-0\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.999887 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-dns-svc\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:45 crc kubenswrapper[4599]: I1012 07:52:45.999958 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-config\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " 
pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:46 crc kubenswrapper[4599]: I1012 07:52:45.999995 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-dns-swift-storage-0\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:46 crc kubenswrapper[4599]: I1012 07:52:46.000026 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-ovsdbserver-nb\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:46 crc kubenswrapper[4599]: I1012 07:52:46.000121 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckqzx\" (UniqueName: \"kubernetes.io/projected/b88d7e52-0543-47be-adf8-5294aeabb3e9-kube-api-access-ckqzx\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:46 crc kubenswrapper[4599]: I1012 07:52:46.000216 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-ovsdbserver-sb\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:46 crc kubenswrapper[4599]: I1012 07:52:46.000391 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " 
pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:46 crc kubenswrapper[4599]: I1012 07:52:46.001007 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-dns-svc\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:46 crc kubenswrapper[4599]: I1012 07:52:46.001006 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-config\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:46 crc kubenswrapper[4599]: I1012 07:52:46.001540 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-dns-swift-storage-0\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:46 crc kubenswrapper[4599]: I1012 07:52:46.001569 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-ovsdbserver-sb\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:46 crc kubenswrapper[4599]: I1012 07:52:46.001561 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-ovsdbserver-nb\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:46 crc kubenswrapper[4599]: I1012 07:52:46.002088 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:46 crc kubenswrapper[4599]: I1012 07:52:46.017663 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckqzx\" (UniqueName: \"kubernetes.io/projected/b88d7e52-0543-47be-adf8-5294aeabb3e9-kube-api-access-ckqzx\") pod \"dnsmasq-dns-7c979c585f-grz99\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:46 crc kubenswrapper[4599]: I1012 07:52:46.107241 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:46 crc kubenswrapper[4599]: I1012 07:52:46.514238 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c979c585f-grz99"] Oct 12 07:52:46 crc kubenswrapper[4599]: I1012 07:52:46.652225 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a9c90266-89e1-4527-8fa2-91826cbcc778","Type":"ContainerStarted","Data":"08f38652ccd452d1ecfe6bb3670e63e72a8770296e17fd208cbdf68a9752ca1d"} Oct 12 07:52:46 crc kubenswrapper[4599]: I1012 07:52:46.654667 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c979c585f-grz99" event={"ID":"b88d7e52-0543-47be-adf8-5294aeabb3e9","Type":"ContainerStarted","Data":"e5bb3c60561b74be804c04956e9cddcdc77edc3cb1f210d2ea7ab0963f117285"} Oct 12 07:52:47 crc kubenswrapper[4599]: I1012 07:52:47.665224 4599 generic.go:334] "Generic (PLEG): container finished" podID="b88d7e52-0543-47be-adf8-5294aeabb3e9" containerID="b6a74b069ddf1c7fb5f9192bf4a793d89ecffb0c2f5669fd5fef53e4e605772e" exitCode=0 Oct 12 07:52:47 crc kubenswrapper[4599]: I1012 
07:52:47.665296 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c979c585f-grz99" event={"ID":"b88d7e52-0543-47be-adf8-5294aeabb3e9","Type":"ContainerDied","Data":"b6a74b069ddf1c7fb5f9192bf4a793d89ecffb0c2f5669fd5fef53e4e605772e"} Oct 12 07:52:47 crc kubenswrapper[4599]: I1012 07:52:47.668130 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ab388f40-a761-44f6-812f-df5cf4b02b73","Type":"ContainerStarted","Data":"c4b19f96e01d826bb426c38c06cf8755d36155ec7b5e90e7c266acc0b8ee0862"} Oct 12 07:52:47 crc kubenswrapper[4599]: I1012 07:52:47.670542 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a9c90266-89e1-4527-8fa2-91826cbcc778","Type":"ContainerStarted","Data":"15a649d5d27b21eae8af830f3b839f1526f77aae3a25d24e57d477809ecf2dee"} Oct 12 07:52:48 crc kubenswrapper[4599]: I1012 07:52:48.679754 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c979c585f-grz99" event={"ID":"b88d7e52-0543-47be-adf8-5294aeabb3e9","Type":"ContainerStarted","Data":"8c151d672fd6b4d3f644b4a15d5c8268d28d4ec50593f73e190e1056788a3dd6"} Oct 12 07:52:48 crc kubenswrapper[4599]: I1012 07:52:48.704915 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c979c585f-grz99" podStartSLOduration=3.7048932150000002 podStartE2EDuration="3.704893215s" podCreationTimestamp="2025-10-12 07:52:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:52:48.69573241 +0000 UTC m=+1065.484927912" watchObservedRunningTime="2025-10-12 07:52:48.704893215 +0000 UTC m=+1065.494088717" Oct 12 07:52:49 crc kubenswrapper[4599]: I1012 07:52:49.691259 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 
07:52:56.108579 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.152209 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57db76b469-s5l9q"] Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.152473 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57db76b469-s5l9q" podUID="2f4ce98b-b927-4e97-8066-e8eac5fa0c5e" containerName="dnsmasq-dns" containerID="cri-o://12b9fccc3bac8c37f063d62ea18186c7beb8665965052758fae59d0afa547fc5" gracePeriod=10 Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.266766 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d885c8d8c-mm9q2"] Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.268613 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.276918 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d885c8d8c-mm9q2"] Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.423574 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-ovsdbserver-sb\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.423668 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bvdh\" (UniqueName: \"kubernetes.io/projected/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-kube-api-access-5bvdh\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" 
Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.423709 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-dns-swift-storage-0\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.423731 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-ovsdbserver-nb\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.423776 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-config\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.423822 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.423850 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-dns-svc\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " 
pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.525715 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-config\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.526109 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.526178 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-dns-svc\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.526227 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-ovsdbserver-sb\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.526370 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bvdh\" (UniqueName: \"kubernetes.io/projected/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-kube-api-access-5bvdh\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 
crc kubenswrapper[4599]: I1012 07:52:56.526444 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-dns-swift-storage-0\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.526481 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-ovsdbserver-nb\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.528153 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-ovsdbserver-sb\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.532864 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-dns-swift-storage-0\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.534822 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-config\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.537890 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.538431 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-ovsdbserver-nb\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.540779 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-dns-svc\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.558015 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bvdh\" (UniqueName: \"kubernetes.io/projected/aa8d5577-ea50-40e9-8549-7c7ad4da7ee6-kube-api-access-5bvdh\") pod \"dnsmasq-dns-5d885c8d8c-mm9q2\" (UID: \"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6\") " pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.606283 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.613914 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.730432 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64zv7\" (UniqueName: \"kubernetes.io/projected/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-kube-api-access-64zv7\") pod \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.730477 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-ovsdbserver-nb\") pod \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.730751 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-config\") pod \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.730781 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-ovsdbserver-sb\") pod \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.730897 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-dns-svc\") pod \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.730928 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-dns-swift-storage-0\") pod \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\" (UID: \"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e\") " Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.736505 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-kube-api-access-64zv7" (OuterVolumeSpecName: "kube-api-access-64zv7") pod "2f4ce98b-b927-4e97-8066-e8eac5fa0c5e" (UID: "2f4ce98b-b927-4e97-8066-e8eac5fa0c5e"). InnerVolumeSpecName "kube-api-access-64zv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.762472 4599 generic.go:334] "Generic (PLEG): container finished" podID="2f4ce98b-b927-4e97-8066-e8eac5fa0c5e" containerID="12b9fccc3bac8c37f063d62ea18186c7beb8665965052758fae59d0afa547fc5" exitCode=0 Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.762517 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db76b469-s5l9q" event={"ID":"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e","Type":"ContainerDied","Data":"12b9fccc3bac8c37f063d62ea18186c7beb8665965052758fae59d0afa547fc5"} Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.762547 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db76b469-s5l9q" event={"ID":"2f4ce98b-b927-4e97-8066-e8eac5fa0c5e","Type":"ContainerDied","Data":"c907c305df36461dc94f5a11064d1a23d5a79a8d9f3de80a8f529e4d6d810682"} Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.762568 4599 scope.go:117] "RemoveContainer" containerID="12b9fccc3bac8c37f063d62ea18186c7beb8665965052758fae59d0afa547fc5" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.762698 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57db76b469-s5l9q" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.780956 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2f4ce98b-b927-4e97-8066-e8eac5fa0c5e" (UID: "2f4ce98b-b927-4e97-8066-e8eac5fa0c5e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.781213 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2f4ce98b-b927-4e97-8066-e8eac5fa0c5e" (UID: "2f4ce98b-b927-4e97-8066-e8eac5fa0c5e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.781372 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2f4ce98b-b927-4e97-8066-e8eac5fa0c5e" (UID: "2f4ce98b-b927-4e97-8066-e8eac5fa0c5e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.783276 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-config" (OuterVolumeSpecName: "config") pod "2f4ce98b-b927-4e97-8066-e8eac5fa0c5e" (UID: "2f4ce98b-b927-4e97-8066-e8eac5fa0c5e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.785239 4599 scope.go:117] "RemoveContainer" containerID="2464fe787b05c66b60ac6861e434544738b2c96cdc2b4245742a7a2fb93983fd" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.788780 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2f4ce98b-b927-4e97-8066-e8eac5fa0c5e" (UID: "2f4ce98b-b927-4e97-8066-e8eac5fa0c5e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.803456 4599 scope.go:117] "RemoveContainer" containerID="12b9fccc3bac8c37f063d62ea18186c7beb8665965052758fae59d0afa547fc5" Oct 12 07:52:56 crc kubenswrapper[4599]: E1012 07:52:56.805381 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12b9fccc3bac8c37f063d62ea18186c7beb8665965052758fae59d0afa547fc5\": container with ID starting with 12b9fccc3bac8c37f063d62ea18186c7beb8665965052758fae59d0afa547fc5 not found: ID does not exist" containerID="12b9fccc3bac8c37f063d62ea18186c7beb8665965052758fae59d0afa547fc5" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.805418 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12b9fccc3bac8c37f063d62ea18186c7beb8665965052758fae59d0afa547fc5"} err="failed to get container status \"12b9fccc3bac8c37f063d62ea18186c7beb8665965052758fae59d0afa547fc5\": rpc error: code = NotFound desc = could not find container \"12b9fccc3bac8c37f063d62ea18186c7beb8665965052758fae59d0afa547fc5\": container with ID starting with 12b9fccc3bac8c37f063d62ea18186c7beb8665965052758fae59d0afa547fc5 not found: ID does not exist" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.805446 4599 scope.go:117] 
"RemoveContainer" containerID="2464fe787b05c66b60ac6861e434544738b2c96cdc2b4245742a7a2fb93983fd" Oct 12 07:52:56 crc kubenswrapper[4599]: E1012 07:52:56.805780 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2464fe787b05c66b60ac6861e434544738b2c96cdc2b4245742a7a2fb93983fd\": container with ID starting with 2464fe787b05c66b60ac6861e434544738b2c96cdc2b4245742a7a2fb93983fd not found: ID does not exist" containerID="2464fe787b05c66b60ac6861e434544738b2c96cdc2b4245742a7a2fb93983fd" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.805826 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2464fe787b05c66b60ac6861e434544738b2c96cdc2b4245742a7a2fb93983fd"} err="failed to get container status \"2464fe787b05c66b60ac6861e434544738b2c96cdc2b4245742a7a2fb93983fd\": rpc error: code = NotFound desc = could not find container \"2464fe787b05c66b60ac6861e434544738b2c96cdc2b4245742a7a2fb93983fd\": container with ID starting with 2464fe787b05c66b60ac6861e434544738b2c96cdc2b4245742a7a2fb93983fd not found: ID does not exist" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.833882 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.833908 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.833918 4599 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.833927 4599 
reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.833937 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64zv7\" (UniqueName: \"kubernetes.io/projected/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-kube-api-access-64zv7\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:56 crc kubenswrapper[4599]: I1012 07:52:56.833946 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 07:52:57 crc kubenswrapper[4599]: I1012 07:52:57.030077 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d885c8d8c-mm9q2"] Oct 12 07:52:57 crc kubenswrapper[4599]: I1012 07:52:57.096584 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57db76b469-s5l9q"] Oct 12 07:52:57 crc kubenswrapper[4599]: I1012 07:52:57.103031 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57db76b469-s5l9q"] Oct 12 07:52:57 crc kubenswrapper[4599]: I1012 07:52:57.556096 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f4ce98b-b927-4e97-8066-e8eac5fa0c5e" path="/var/lib/kubelet/pods/2f4ce98b-b927-4e97-8066-e8eac5fa0c5e/volumes" Oct 12 07:52:57 crc kubenswrapper[4599]: I1012 07:52:57.773990 4599 generic.go:334] "Generic (PLEG): container finished" podID="aa8d5577-ea50-40e9-8549-7c7ad4da7ee6" containerID="12d489b41a03fbe441f874923d7e784e2f07a8bb3e8189754ed5244631c0bf7b" exitCode=0 Oct 12 07:52:57 crc kubenswrapper[4599]: I1012 07:52:57.774041 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" 
event={"ID":"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6","Type":"ContainerDied","Data":"12d489b41a03fbe441f874923d7e784e2f07a8bb3e8189754ed5244631c0bf7b"} Oct 12 07:52:57 crc kubenswrapper[4599]: I1012 07:52:57.774078 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" event={"ID":"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6","Type":"ContainerStarted","Data":"2200aebf89d4130133b1563ba2c27f03f02e7279bcca55d10c67a5adca906594"} Oct 12 07:52:58 crc kubenswrapper[4599]: I1012 07:52:58.322091 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:52:58 crc kubenswrapper[4599]: I1012 07:52:58.322440 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:52:58 crc kubenswrapper[4599]: I1012 07:52:58.788704 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" event={"ID":"aa8d5577-ea50-40e9-8549-7c7ad4da7ee6","Type":"ContainerStarted","Data":"22b85e54492eb794bdd885470d87737418451832fc030c6432e10fcf706f7229"} Oct 12 07:52:58 crc kubenswrapper[4599]: I1012 07:52:58.790089 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:52:58 crc kubenswrapper[4599]: I1012 07:52:58.809683 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" podStartSLOduration=2.8096601359999998 podStartE2EDuration="2.809660136s" podCreationTimestamp="2025-10-12 07:52:56 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:52:58.804994213 +0000 UTC m=+1075.594189715" watchObservedRunningTime="2025-10-12 07:52:58.809660136 +0000 UTC m=+1075.598855638" Oct 12 07:53:06 crc kubenswrapper[4599]: I1012 07:53:06.615527 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d885c8d8c-mm9q2" Oct 12 07:53:06 crc kubenswrapper[4599]: I1012 07:53:06.664166 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c979c585f-grz99"] Oct 12 07:53:06 crc kubenswrapper[4599]: I1012 07:53:06.664438 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c979c585f-grz99" podUID="b88d7e52-0543-47be-adf8-5294aeabb3e9" containerName="dnsmasq-dns" containerID="cri-o://8c151d672fd6b4d3f644b4a15d5c8268d28d4ec50593f73e190e1056788a3dd6" gracePeriod=10 Oct 12 07:53:06 crc kubenswrapper[4599]: I1012 07:53:06.862305 4599 generic.go:334] "Generic (PLEG): container finished" podID="b88d7e52-0543-47be-adf8-5294aeabb3e9" containerID="8c151d672fd6b4d3f644b4a15d5c8268d28d4ec50593f73e190e1056788a3dd6" exitCode=0 Oct 12 07:53:06 crc kubenswrapper[4599]: I1012 07:53:06.862402 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c979c585f-grz99" event={"ID":"b88d7e52-0543-47be-adf8-5294aeabb3e9","Type":"ContainerDied","Data":"8c151d672fd6b4d3f644b4a15d5c8268d28d4ec50593f73e190e1056788a3dd6"} Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.066866 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.180627 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckqzx\" (UniqueName: \"kubernetes.io/projected/b88d7e52-0543-47be-adf8-5294aeabb3e9-kube-api-access-ckqzx\") pod \"b88d7e52-0543-47be-adf8-5294aeabb3e9\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.180731 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-dns-swift-storage-0\") pod \"b88d7e52-0543-47be-adf8-5294aeabb3e9\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.181166 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-openstack-edpm-ipam\") pod \"b88d7e52-0543-47be-adf8-5294aeabb3e9\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.181257 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-ovsdbserver-sb\") pod \"b88d7e52-0543-47be-adf8-5294aeabb3e9\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.181350 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-config\") pod \"b88d7e52-0543-47be-adf8-5294aeabb3e9\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.181383 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-ovsdbserver-nb\") pod \"b88d7e52-0543-47be-adf8-5294aeabb3e9\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.181418 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-dns-svc\") pod \"b88d7e52-0543-47be-adf8-5294aeabb3e9\" (UID: \"b88d7e52-0543-47be-adf8-5294aeabb3e9\") " Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.188792 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b88d7e52-0543-47be-adf8-5294aeabb3e9-kube-api-access-ckqzx" (OuterVolumeSpecName: "kube-api-access-ckqzx") pod "b88d7e52-0543-47be-adf8-5294aeabb3e9" (UID: "b88d7e52-0543-47be-adf8-5294aeabb3e9"). InnerVolumeSpecName "kube-api-access-ckqzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.229981 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b88d7e52-0543-47be-adf8-5294aeabb3e9" (UID: "b88d7e52-0543-47be-adf8-5294aeabb3e9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.230486 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b88d7e52-0543-47be-adf8-5294aeabb3e9" (UID: "b88d7e52-0543-47be-adf8-5294aeabb3e9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.231456 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-config" (OuterVolumeSpecName: "config") pod "b88d7e52-0543-47be-adf8-5294aeabb3e9" (UID: "b88d7e52-0543-47be-adf8-5294aeabb3e9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.233265 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "b88d7e52-0543-47be-adf8-5294aeabb3e9" (UID: "b88d7e52-0543-47be-adf8-5294aeabb3e9"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.235711 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b88d7e52-0543-47be-adf8-5294aeabb3e9" (UID: "b88d7e52-0543-47be-adf8-5294aeabb3e9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.237257 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b88d7e52-0543-47be-adf8-5294aeabb3e9" (UID: "b88d7e52-0543-47be-adf8-5294aeabb3e9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.283901 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.283932 4599 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.283942 4599 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.283951 4599 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.283961 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckqzx\" (UniqueName: \"kubernetes.io/projected/b88d7e52-0543-47be-adf8-5294aeabb3e9-kube-api-access-ckqzx\") on node \"crc\" DevicePath \"\"" Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.283971 4599 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.283980 4599 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b88d7e52-0543-47be-adf8-5294aeabb3e9-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.874005 
4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c979c585f-grz99" event={"ID":"b88d7e52-0543-47be-adf8-5294aeabb3e9","Type":"ContainerDied","Data":"e5bb3c60561b74be804c04956e9cddcdc77edc3cb1f210d2ea7ab0963f117285"} Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.874464 4599 scope.go:117] "RemoveContainer" containerID="8c151d672fd6b4d3f644b4a15d5c8268d28d4ec50593f73e190e1056788a3dd6" Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.874098 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c979c585f-grz99" Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.901172 4599 scope.go:117] "RemoveContainer" containerID="b6a74b069ddf1c7fb5f9192bf4a793d89ecffb0c2f5669fd5fef53e4e605772e" Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.901560 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c979c585f-grz99"] Oct 12 07:53:07 crc kubenswrapper[4599]: I1012 07:53:07.907728 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c979c585f-grz99"] Oct 12 07:53:09 crc kubenswrapper[4599]: I1012 07:53:09.553908 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b88d7e52-0543-47be-adf8-5294aeabb3e9" path="/var/lib/kubelet/pods/b88d7e52-0543-47be-adf8-5294aeabb3e9/volumes" Oct 12 07:53:18 crc kubenswrapper[4599]: I1012 07:53:18.975523 4599 generic.go:334] "Generic (PLEG): container finished" podID="ab388f40-a761-44f6-812f-df5cf4b02b73" containerID="c4b19f96e01d826bb426c38c06cf8755d36155ec7b5e90e7c266acc0b8ee0862" exitCode=0 Oct 12 07:53:18 crc kubenswrapper[4599]: I1012 07:53:18.975614 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ab388f40-a761-44f6-812f-df5cf4b02b73","Type":"ContainerDied","Data":"c4b19f96e01d826bb426c38c06cf8755d36155ec7b5e90e7c266acc0b8ee0862"} Oct 12 07:53:18 crc kubenswrapper[4599]: I1012 07:53:18.977596 4599 
generic.go:334] "Generic (PLEG): container finished" podID="a9c90266-89e1-4527-8fa2-91826cbcc778" containerID="15a649d5d27b21eae8af830f3b839f1526f77aae3a25d24e57d477809ecf2dee" exitCode=0 Oct 12 07:53:18 crc kubenswrapper[4599]: I1012 07:53:18.977625 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a9c90266-89e1-4527-8fa2-91826cbcc778","Type":"ContainerDied","Data":"15a649d5d27b21eae8af830f3b839f1526f77aae3a25d24e57d477809ecf2dee"} Oct 12 07:53:19 crc kubenswrapper[4599]: I1012 07:53:19.989556 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a9c90266-89e1-4527-8fa2-91826cbcc778","Type":"ContainerStarted","Data":"e73ea94e04160c34c4c5f80ca41a65ccdb313f3fa0cd02f216accc3004295e1a"} Oct 12 07:53:19 crc kubenswrapper[4599]: I1012 07:53:19.990204 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:53:19 crc kubenswrapper[4599]: I1012 07:53:19.992738 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ab388f40-a761-44f6-812f-df5cf4b02b73","Type":"ContainerStarted","Data":"72118e0e9d03d547c43a8e364cb48547a33c3ef5d57b81b128139b05631d9016"} Oct 12 07:53:19 crc kubenswrapper[4599]: I1012 07:53:19.993083 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 12 07:53:20 crc kubenswrapper[4599]: I1012 07:53:20.012196 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.012173119 podStartE2EDuration="36.012173119s" podCreationTimestamp="2025-10-12 07:52:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:53:20.007945043 +0000 UTC m=+1096.797140545" watchObservedRunningTime="2025-10-12 07:53:20.012173119 +0000 UTC 
m=+1096.801368622" Oct 12 07:53:20 crc kubenswrapper[4599]: I1012 07:53:20.036657 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.036635295 podStartE2EDuration="36.036635295s" podCreationTimestamp="2025-10-12 07:52:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:53:20.031900593 +0000 UTC m=+1096.821096094" watchObservedRunningTime="2025-10-12 07:53:20.036635295 +0000 UTC m=+1096.825830798" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.407591 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl"] Oct 12 07:53:23 crc kubenswrapper[4599]: E1012 07:53:23.408457 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f4ce98b-b927-4e97-8066-e8eac5fa0c5e" containerName="dnsmasq-dns" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.408471 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f4ce98b-b927-4e97-8066-e8eac5fa0c5e" containerName="dnsmasq-dns" Oct 12 07:53:23 crc kubenswrapper[4599]: E1012 07:53:23.408496 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88d7e52-0543-47be-adf8-5294aeabb3e9" containerName="dnsmasq-dns" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.408501 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88d7e52-0543-47be-adf8-5294aeabb3e9" containerName="dnsmasq-dns" Oct 12 07:53:23 crc kubenswrapper[4599]: E1012 07:53:23.408511 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88d7e52-0543-47be-adf8-5294aeabb3e9" containerName="init" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.408516 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88d7e52-0543-47be-adf8-5294aeabb3e9" containerName="init" Oct 12 07:53:23 crc kubenswrapper[4599]: E1012 07:53:23.408524 4599 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f4ce98b-b927-4e97-8066-e8eac5fa0c5e" containerName="init" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.408531 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f4ce98b-b927-4e97-8066-e8eac5fa0c5e" containerName="init" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.408706 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f4ce98b-b927-4e97-8066-e8eac5fa0c5e" containerName="dnsmasq-dns" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.408718 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="b88d7e52-0543-47be-adf8-5294aeabb3e9" containerName="dnsmasq-dns" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.409298 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.417932 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.418051 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.418125 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x82pw" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.418872 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.422819 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl"] Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.545802 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nqt68\" (UniqueName: \"kubernetes.io/projected/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-kube-api-access-nqt68\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl\" (UID: \"dc14ab80-62e8-47ec-bf5b-370ccfd95eff\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.545882 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl\" (UID: \"dc14ab80-62e8-47ec-bf5b-370ccfd95eff\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.546131 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl\" (UID: \"dc14ab80-62e8-47ec-bf5b-370ccfd95eff\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.546207 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl\" (UID: \"dc14ab80-62e8-47ec-bf5b-370ccfd95eff\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.648641 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl\" (UID: 
\"dc14ab80-62e8-47ec-bf5b-370ccfd95eff\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.648722 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl\" (UID: \"dc14ab80-62e8-47ec-bf5b-370ccfd95eff\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.648755 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl\" (UID: \"dc14ab80-62e8-47ec-bf5b-370ccfd95eff\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.648854 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqt68\" (UniqueName: \"kubernetes.io/projected/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-kube-api-access-nqt68\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl\" (UID: \"dc14ab80-62e8-47ec-bf5b-370ccfd95eff\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.657240 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl\" (UID: \"dc14ab80-62e8-47ec-bf5b-370ccfd95eff\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.658033 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl\" (UID: \"dc14ab80-62e8-47ec-bf5b-370ccfd95eff\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.658215 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl\" (UID: \"dc14ab80-62e8-47ec-bf5b-370ccfd95eff\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.664830 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqt68\" (UniqueName: \"kubernetes.io/projected/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-kube-api-access-nqt68\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl\" (UID: \"dc14ab80-62e8-47ec-bf5b-370ccfd95eff\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" Oct 12 07:53:23 crc kubenswrapper[4599]: I1012 07:53:23.727388 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" Oct 12 07:53:24 crc kubenswrapper[4599]: I1012 07:53:24.219714 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl"] Oct 12 07:53:24 crc kubenswrapper[4599]: I1012 07:53:24.227187 4599 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 07:53:25 crc kubenswrapper[4599]: I1012 07:53:25.034659 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" event={"ID":"dc14ab80-62e8-47ec-bf5b-370ccfd95eff","Type":"ContainerStarted","Data":"1af625a7f5c4f3adf249dbfd5611d099f3ee294984906c5cf81fc40ac09c5834"} Oct 12 07:53:28 crc kubenswrapper[4599]: I1012 07:53:28.322060 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:53:28 crc kubenswrapper[4599]: I1012 07:53:28.322982 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:53:28 crc kubenswrapper[4599]: I1012 07:53:28.323087 4599 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 07:53:28 crc kubenswrapper[4599]: I1012 07:53:28.324160 4599 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"409590f6935d88c6579c0848195b75ccd573f94456a7d800342528199b5f70c8"} pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 07:53:28 crc kubenswrapper[4599]: I1012 07:53:28.324215 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" containerID="cri-o://409590f6935d88c6579c0848195b75ccd573f94456a7d800342528199b5f70c8" gracePeriod=600 Oct 12 07:53:29 crc kubenswrapper[4599]: I1012 07:53:29.076853 4599 generic.go:334] "Generic (PLEG): container finished" podID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerID="409590f6935d88c6579c0848195b75ccd573f94456a7d800342528199b5f70c8" exitCode=0 Oct 12 07:53:29 crc kubenswrapper[4599]: I1012 07:53:29.076919 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerDied","Data":"409590f6935d88c6579c0848195b75ccd573f94456a7d800342528199b5f70c8"} Oct 12 07:53:29 crc kubenswrapper[4599]: I1012 07:53:29.077366 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerStarted","Data":"55f9a891f5b2d1aee66eb372fd29e5ae6e5d430252e3a9f1e9416dab9bddaf9f"} Oct 12 07:53:29 crc kubenswrapper[4599]: I1012 07:53:29.077392 4599 scope.go:117] "RemoveContainer" containerID="f791fc6fe233d5a2dcb3bd14d2fd8d76369bf4f0ae51317c4f0bb3b0e75a17de" Oct 12 07:53:35 crc kubenswrapper[4599]: I1012 07:53:35.150427 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" 
event={"ID":"dc14ab80-62e8-47ec-bf5b-370ccfd95eff","Type":"ContainerStarted","Data":"9eab6a0642ccc3b482d7e1548ce000e4092d4e606878401a33cb3e838343e46e"} Oct 12 07:53:35 crc kubenswrapper[4599]: I1012 07:53:35.172794 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" podStartSLOduration=1.926038199 podStartE2EDuration="12.172776988s" podCreationTimestamp="2025-10-12 07:53:23 +0000 UTC" firstStartedPulling="2025-10-12 07:53:24.22694454 +0000 UTC m=+1101.016140032" lastFinishedPulling="2025-10-12 07:53:34.473683319 +0000 UTC m=+1111.262878821" observedRunningTime="2025-10-12 07:53:35.165245408 +0000 UTC m=+1111.954440911" watchObservedRunningTime="2025-10-12 07:53:35.172776988 +0000 UTC m=+1111.961972490" Oct 12 07:53:35 crc kubenswrapper[4599]: I1012 07:53:35.184555 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 12 07:53:35 crc kubenswrapper[4599]: I1012 07:53:35.275577 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 12 07:53:46 crc kubenswrapper[4599]: I1012 07:53:46.255177 4599 generic.go:334] "Generic (PLEG): container finished" podID="dc14ab80-62e8-47ec-bf5b-370ccfd95eff" containerID="9eab6a0642ccc3b482d7e1548ce000e4092d4e606878401a33cb3e838343e46e" exitCode=0 Oct 12 07:53:46 crc kubenswrapper[4599]: I1012 07:53:46.255272 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" event={"ID":"dc14ab80-62e8-47ec-bf5b-370ccfd95eff","Type":"ContainerDied","Data":"9eab6a0642ccc3b482d7e1548ce000e4092d4e606878401a33cb3e838343e46e"} Oct 12 07:53:47 crc kubenswrapper[4599]: I1012 07:53:47.620796 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" Oct 12 07:53:47 crc kubenswrapper[4599]: I1012 07:53:47.822392 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-inventory\") pod \"dc14ab80-62e8-47ec-bf5b-370ccfd95eff\" (UID: \"dc14ab80-62e8-47ec-bf5b-370ccfd95eff\") " Oct 12 07:53:47 crc kubenswrapper[4599]: I1012 07:53:47.822559 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-ssh-key\") pod \"dc14ab80-62e8-47ec-bf5b-370ccfd95eff\" (UID: \"dc14ab80-62e8-47ec-bf5b-370ccfd95eff\") " Oct 12 07:53:47 crc kubenswrapper[4599]: I1012 07:53:47.822617 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-repo-setup-combined-ca-bundle\") pod \"dc14ab80-62e8-47ec-bf5b-370ccfd95eff\" (UID: \"dc14ab80-62e8-47ec-bf5b-370ccfd95eff\") " Oct 12 07:53:47 crc kubenswrapper[4599]: I1012 07:53:47.822797 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqt68\" (UniqueName: \"kubernetes.io/projected/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-kube-api-access-nqt68\") pod \"dc14ab80-62e8-47ec-bf5b-370ccfd95eff\" (UID: \"dc14ab80-62e8-47ec-bf5b-370ccfd95eff\") " Oct 12 07:53:47 crc kubenswrapper[4599]: I1012 07:53:47.831470 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "dc14ab80-62e8-47ec-bf5b-370ccfd95eff" (UID: "dc14ab80-62e8-47ec-bf5b-370ccfd95eff"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:53:47 crc kubenswrapper[4599]: I1012 07:53:47.834577 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-kube-api-access-nqt68" (OuterVolumeSpecName: "kube-api-access-nqt68") pod "dc14ab80-62e8-47ec-bf5b-370ccfd95eff" (UID: "dc14ab80-62e8-47ec-bf5b-370ccfd95eff"). InnerVolumeSpecName "kube-api-access-nqt68". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:53:47 crc kubenswrapper[4599]: I1012 07:53:47.849250 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-inventory" (OuterVolumeSpecName: "inventory") pod "dc14ab80-62e8-47ec-bf5b-370ccfd95eff" (UID: "dc14ab80-62e8-47ec-bf5b-370ccfd95eff"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:53:47 crc kubenswrapper[4599]: I1012 07:53:47.850409 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dc14ab80-62e8-47ec-bf5b-370ccfd95eff" (UID: "dc14ab80-62e8-47ec-bf5b-370ccfd95eff"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:53:47 crc kubenswrapper[4599]: I1012 07:53:47.925851 4599 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 07:53:47 crc kubenswrapper[4599]: I1012 07:53:47.926109 4599 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:53:47 crc kubenswrapper[4599]: I1012 07:53:47.926171 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqt68\" (UniqueName: \"kubernetes.io/projected/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-kube-api-access-nqt68\") on node \"crc\" DevicePath \"\"" Oct 12 07:53:47 crc kubenswrapper[4599]: I1012 07:53:47.926226 4599 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc14ab80-62e8-47ec-bf5b-370ccfd95eff-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.274392 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" event={"ID":"dc14ab80-62e8-47ec-bf5b-370ccfd95eff","Type":"ContainerDied","Data":"1af625a7f5c4f3adf249dbfd5611d099f3ee294984906c5cf81fc40ac09c5834"} Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.274475 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1af625a7f5c4f3adf249dbfd5611d099f3ee294984906c5cf81fc40ac09c5834" Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.274749 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl" Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.337606 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv"] Oct 12 07:53:48 crc kubenswrapper[4599]: E1012 07:53:48.337990 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc14ab80-62e8-47ec-bf5b-370ccfd95eff" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.338011 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc14ab80-62e8-47ec-bf5b-370ccfd95eff" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.338210 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc14ab80-62e8-47ec-bf5b-370ccfd95eff" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.338870 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv" Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.341677 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.341699 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.341796 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x82pw" Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.341962 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.352557 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv"] Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.436532 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pvc4\" (UniqueName: \"kubernetes.io/projected/39f84a87-f390-4864-85f1-d4df13fe6b93-kube-api-access-7pvc4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-sx5tv\" (UID: \"39f84a87-f390-4864-85f1-d4df13fe6b93\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv" Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.436628 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39f84a87-f390-4864-85f1-d4df13fe6b93-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-sx5tv\" (UID: \"39f84a87-f390-4864-85f1-d4df13fe6b93\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv" Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.436825 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39f84a87-f390-4864-85f1-d4df13fe6b93-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-sx5tv\" (UID: \"39f84a87-f390-4864-85f1-d4df13fe6b93\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv" Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.539529 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pvc4\" (UniqueName: \"kubernetes.io/projected/39f84a87-f390-4864-85f1-d4df13fe6b93-kube-api-access-7pvc4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-sx5tv\" (UID: \"39f84a87-f390-4864-85f1-d4df13fe6b93\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv" Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.539604 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39f84a87-f390-4864-85f1-d4df13fe6b93-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-sx5tv\" (UID: \"39f84a87-f390-4864-85f1-d4df13fe6b93\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv" Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.539636 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39f84a87-f390-4864-85f1-d4df13fe6b93-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-sx5tv\" (UID: \"39f84a87-f390-4864-85f1-d4df13fe6b93\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv" Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.544485 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39f84a87-f390-4864-85f1-d4df13fe6b93-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-sx5tv\" (UID: \"39f84a87-f390-4864-85f1-d4df13fe6b93\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv" Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.546466 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39f84a87-f390-4864-85f1-d4df13fe6b93-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-sx5tv\" (UID: \"39f84a87-f390-4864-85f1-d4df13fe6b93\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv" Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.555676 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pvc4\" (UniqueName: \"kubernetes.io/projected/39f84a87-f390-4864-85f1-d4df13fe6b93-kube-api-access-7pvc4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-sx5tv\" (UID: \"39f84a87-f390-4864-85f1-d4df13fe6b93\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv" Oct 12 07:53:48 crc kubenswrapper[4599]: I1012 07:53:48.654825 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv" Oct 12 07:53:49 crc kubenswrapper[4599]: I1012 07:53:49.144501 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv"] Oct 12 07:53:49 crc kubenswrapper[4599]: W1012 07:53:49.147901 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39f84a87_f390_4864_85f1_d4df13fe6b93.slice/crio-eccda89b671ba710f7aabadc76fa0ccebd0be86b367663da961516c2331e4e61 WatchSource:0}: Error finding container eccda89b671ba710f7aabadc76fa0ccebd0be86b367663da961516c2331e4e61: Status 404 returned error can't find the container with id eccda89b671ba710f7aabadc76fa0ccebd0be86b367663da961516c2331e4e61 Oct 12 07:53:49 crc kubenswrapper[4599]: I1012 07:53:49.285870 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv" event={"ID":"39f84a87-f390-4864-85f1-d4df13fe6b93","Type":"ContainerStarted","Data":"eccda89b671ba710f7aabadc76fa0ccebd0be86b367663da961516c2331e4e61"} Oct 12 07:53:50 crc kubenswrapper[4599]: I1012 07:53:50.298197 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv" event={"ID":"39f84a87-f390-4864-85f1-d4df13fe6b93","Type":"ContainerStarted","Data":"ed23e6196f441f4864bafdb88bbdc0473fe972a69247a6cfe7fa11892e8fb668"} Oct 12 07:53:50 crc kubenswrapper[4599]: I1012 07:53:50.317564 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv" podStartSLOduration=1.7007567670000001 podStartE2EDuration="2.317541407s" podCreationTimestamp="2025-10-12 07:53:48 +0000 UTC" firstStartedPulling="2025-10-12 07:53:49.151153399 +0000 UTC m=+1125.940348911" lastFinishedPulling="2025-10-12 07:53:49.767938049 +0000 UTC m=+1126.557133551" 
observedRunningTime="2025-10-12 07:53:50.315007767 +0000 UTC m=+1127.104203269" watchObservedRunningTime="2025-10-12 07:53:50.317541407 +0000 UTC m=+1127.106736899" Oct 12 07:53:52 crc kubenswrapper[4599]: I1012 07:53:52.322185 4599 generic.go:334] "Generic (PLEG): container finished" podID="39f84a87-f390-4864-85f1-d4df13fe6b93" containerID="ed23e6196f441f4864bafdb88bbdc0473fe972a69247a6cfe7fa11892e8fb668" exitCode=0 Oct 12 07:53:52 crc kubenswrapper[4599]: I1012 07:53:52.322315 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv" event={"ID":"39f84a87-f390-4864-85f1-d4df13fe6b93","Type":"ContainerDied","Data":"ed23e6196f441f4864bafdb88bbdc0473fe972a69247a6cfe7fa11892e8fb668"} Oct 12 07:53:53 crc kubenswrapper[4599]: I1012 07:53:53.740255 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv" Oct 12 07:53:53 crc kubenswrapper[4599]: I1012 07:53:53.779220 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39f84a87-f390-4864-85f1-d4df13fe6b93-inventory\") pod \"39f84a87-f390-4864-85f1-d4df13fe6b93\" (UID: \"39f84a87-f390-4864-85f1-d4df13fe6b93\") " Oct 12 07:53:53 crc kubenswrapper[4599]: I1012 07:53:53.779286 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pvc4\" (UniqueName: \"kubernetes.io/projected/39f84a87-f390-4864-85f1-d4df13fe6b93-kube-api-access-7pvc4\") pod \"39f84a87-f390-4864-85f1-d4df13fe6b93\" (UID: \"39f84a87-f390-4864-85f1-d4df13fe6b93\") " Oct 12 07:53:53 crc kubenswrapper[4599]: I1012 07:53:53.779355 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39f84a87-f390-4864-85f1-d4df13fe6b93-ssh-key\") pod \"39f84a87-f390-4864-85f1-d4df13fe6b93\" (UID: 
\"39f84a87-f390-4864-85f1-d4df13fe6b93\") " Oct 12 07:53:53 crc kubenswrapper[4599]: I1012 07:53:53.784719 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39f84a87-f390-4864-85f1-d4df13fe6b93-kube-api-access-7pvc4" (OuterVolumeSpecName: "kube-api-access-7pvc4") pod "39f84a87-f390-4864-85f1-d4df13fe6b93" (UID: "39f84a87-f390-4864-85f1-d4df13fe6b93"). InnerVolumeSpecName "kube-api-access-7pvc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:53:53 crc kubenswrapper[4599]: I1012 07:53:53.803092 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f84a87-f390-4864-85f1-d4df13fe6b93-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "39f84a87-f390-4864-85f1-d4df13fe6b93" (UID: "39f84a87-f390-4864-85f1-d4df13fe6b93"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:53:53 crc kubenswrapper[4599]: I1012 07:53:53.804869 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f84a87-f390-4864-85f1-d4df13fe6b93-inventory" (OuterVolumeSpecName: "inventory") pod "39f84a87-f390-4864-85f1-d4df13fe6b93" (UID: "39f84a87-f390-4864-85f1-d4df13fe6b93"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:53:53 crc kubenswrapper[4599]: I1012 07:53:53.882394 4599 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39f84a87-f390-4864-85f1-d4df13fe6b93-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 07:53:53 crc kubenswrapper[4599]: I1012 07:53:53.882724 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pvc4\" (UniqueName: \"kubernetes.io/projected/39f84a87-f390-4864-85f1-d4df13fe6b93-kube-api-access-7pvc4\") on node \"crc\" DevicePath \"\"" Oct 12 07:53:53 crc kubenswrapper[4599]: I1012 07:53:53.882741 4599 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39f84a87-f390-4864-85f1-d4df13fe6b93-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.341132 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv" event={"ID":"39f84a87-f390-4864-85f1-d4df13fe6b93","Type":"ContainerDied","Data":"eccda89b671ba710f7aabadc76fa0ccebd0be86b367663da961516c2331e4e61"} Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.341187 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eccda89b671ba710f7aabadc76fa0ccebd0be86b367663da961516c2331e4e61" Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.341501 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sx5tv" Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.405033 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd"] Oct 12 07:53:54 crc kubenswrapper[4599]: E1012 07:53:54.405474 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39f84a87-f390-4864-85f1-d4df13fe6b93" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.405495 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f84a87-f390-4864-85f1-d4df13fe6b93" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.405706 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="39f84a87-f390-4864-85f1-d4df13fe6b93" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.406321 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd" Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.408688 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x82pw" Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.408717 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.408698 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.408932 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.421613 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd"] Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.490654 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd\" (UID: \"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd" Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.490709 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx6lb\" (UniqueName: \"kubernetes.io/projected/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-kube-api-access-xx6lb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd\" (UID: \"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd" Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.490738 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd\" (UID: \"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd"
Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.490984 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd\" (UID: \"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd"
Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.593633 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx6lb\" (UniqueName: \"kubernetes.io/projected/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-kube-api-access-xx6lb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd\" (UID: \"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd"
Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.593709 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd\" (UID: \"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd"
Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.593790 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd\" (UID: \"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd"
Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.593898 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd\" (UID: \"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd"
Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.599162 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd\" (UID: \"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd"
Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.599254 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd\" (UID: \"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd"
Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.599398 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd\" (UID: \"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd"
Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.610686 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx6lb\" (UniqueName: \"kubernetes.io/projected/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-kube-api-access-xx6lb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd\" (UID: \"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd"
Oct 12 07:53:54 crc kubenswrapper[4599]: I1012 07:53:54.723186 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd"
Oct 12 07:53:55 crc kubenswrapper[4599]: I1012 07:53:55.153135 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd"]
Oct 12 07:53:55 crc kubenswrapper[4599]: I1012 07:53:55.354713 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd" event={"ID":"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e","Type":"ContainerStarted","Data":"60a4f86efbea3b1a4758041a285b80d618ade755a74ac9a1df7c97050fddd487"}
Oct 12 07:53:56 crc kubenswrapper[4599]: I1012 07:53:56.364667 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd" event={"ID":"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e","Type":"ContainerStarted","Data":"49cd377323cc69c2f55ef6a38ea34ccc368cc0d79e70152ad1c92ef10e1a5905"}
Oct 12 07:53:56 crc kubenswrapper[4599]: I1012 07:53:56.383149 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd" podStartSLOduration=1.793907483 podStartE2EDuration="2.38313248s" podCreationTimestamp="2025-10-12 07:53:54 +0000 UTC" firstStartedPulling="2025-10-12 07:53:55.158619667 +0000 UTC m=+1131.947815169" lastFinishedPulling="2025-10-12 07:53:55.747844664 +0000 UTC m=+1132.537040166" observedRunningTime="2025-10-12 07:53:56.377627434 +0000 UTC m=+1133.166822936" watchObservedRunningTime="2025-10-12 07:53:56.38313248 +0000 UTC m=+1133.172327982"
Oct 12 07:55:05 crc kubenswrapper[4599]: I1012 07:55:05.806077 4599 scope.go:117] "RemoveContainer" containerID="7cf832d2071e6d4f5e705a45d863f0a6493487c0a45db06ab75702f3c66f37f1"
Oct 12 07:55:05 crc kubenswrapper[4599]: I1012 07:55:05.840214 4599 scope.go:117] "RemoveContainer" containerID="fe7e32371e0aaaf494f9280241171cf51b492232bc1f28959847c3d245f89a83"
Oct 12 07:55:28 crc kubenswrapper[4599]: I1012 07:55:28.322378 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 12 07:55:28 crc kubenswrapper[4599]: I1012 07:55:28.323222 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 12 07:55:58 crc kubenswrapper[4599]: I1012 07:55:58.323358 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 12 07:55:58 crc kubenswrapper[4599]: I1012 07:55:58.324080 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 12 07:56:05 crc kubenswrapper[4599]: I1012 07:56:05.926090 4599 scope.go:117] "RemoveContainer" containerID="e42beb0063eaf01d7c5f602a38d181a55b0b694e53c7f9a9dd627c1a22637b2d"
Oct 12 07:56:28 crc kubenswrapper[4599]: I1012 07:56:28.322123 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 12 07:56:28 crc kubenswrapper[4599]: I1012 07:56:28.322837 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 12 07:56:28 crc kubenswrapper[4599]: I1012 07:56:28.322897 4599 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c"
Oct 12 07:56:28 crc kubenswrapper[4599]: I1012 07:56:28.323478 4599 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"55f9a891f5b2d1aee66eb372fd29e5ae6e5d430252e3a9f1e9416dab9bddaf9f"} pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 12 07:56:28 crc kubenswrapper[4599]: I1012 07:56:28.323536 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" containerID="cri-o://55f9a891f5b2d1aee66eb372fd29e5ae6e5d430252e3a9f1e9416dab9bddaf9f" gracePeriod=600
Oct 12 07:56:28 crc kubenswrapper[4599]: I1012 07:56:28.797880 4599 generic.go:334] "Generic (PLEG): container finished" podID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerID="55f9a891f5b2d1aee66eb372fd29e5ae6e5d430252e3a9f1e9416dab9bddaf9f" exitCode=0
Oct 12 07:56:28 crc kubenswrapper[4599]: I1012 07:56:28.797954 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerDied","Data":"55f9a891f5b2d1aee66eb372fd29e5ae6e5d430252e3a9f1e9416dab9bddaf9f"}
Oct 12 07:56:28 crc kubenswrapper[4599]: I1012 07:56:28.798211 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerStarted","Data":"f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968"}
Oct 12 07:56:28 crc kubenswrapper[4599]: I1012 07:56:28.798245 4599 scope.go:117] "RemoveContainer" containerID="409590f6935d88c6579c0848195b75ccd573f94456a7d800342528199b5f70c8"
Oct 12 07:56:48 crc kubenswrapper[4599]: I1012 07:56:48.991374 4599 generic.go:334] "Generic (PLEG): container finished" podID="f2098dde-6e8b-4a07-80d7-fc8e6d2c665e" containerID="49cd377323cc69c2f55ef6a38ea34ccc368cc0d79e70152ad1c92ef10e1a5905" exitCode=0
Oct 12 07:56:48 crc kubenswrapper[4599]: I1012 07:56:48.991399 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd" event={"ID":"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e","Type":"ContainerDied","Data":"49cd377323cc69c2f55ef6a38ea34ccc368cc0d79e70152ad1c92ef10e1a5905"}
Oct 12 07:56:50 crc kubenswrapper[4599]: I1012 07:56:50.320795 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd"
Oct 12 07:56:50 crc kubenswrapper[4599]: I1012 07:56:50.503004 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-inventory\") pod \"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e\" (UID: \"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e\") "
Oct 12 07:56:50 crc kubenswrapper[4599]: I1012 07:56:50.503153 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx6lb\" (UniqueName: \"kubernetes.io/projected/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-kube-api-access-xx6lb\") pod \"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e\" (UID: \"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e\") "
Oct 12 07:56:50 crc kubenswrapper[4599]: I1012 07:56:50.503200 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-bootstrap-combined-ca-bundle\") pod \"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e\" (UID: \"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e\") "
Oct 12 07:56:50 crc kubenswrapper[4599]: I1012 07:56:50.503305 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-ssh-key\") pod \"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e\" (UID: \"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e\") "
Oct 12 07:56:50 crc kubenswrapper[4599]: I1012 07:56:50.508596 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-kube-api-access-xx6lb" (OuterVolumeSpecName: "kube-api-access-xx6lb") pod "f2098dde-6e8b-4a07-80d7-fc8e6d2c665e" (UID: "f2098dde-6e8b-4a07-80d7-fc8e6d2c665e"). InnerVolumeSpecName "kube-api-access-xx6lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 07:56:50 crc kubenswrapper[4599]: I1012 07:56:50.508702 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "f2098dde-6e8b-4a07-80d7-fc8e6d2c665e" (UID: "f2098dde-6e8b-4a07-80d7-fc8e6d2c665e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 07:56:50 crc kubenswrapper[4599]: I1012 07:56:50.526072 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f2098dde-6e8b-4a07-80d7-fc8e6d2c665e" (UID: "f2098dde-6e8b-4a07-80d7-fc8e6d2c665e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 07:56:50 crc kubenswrapper[4599]: I1012 07:56:50.526446 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-inventory" (OuterVolumeSpecName: "inventory") pod "f2098dde-6e8b-4a07-80d7-fc8e6d2c665e" (UID: "f2098dde-6e8b-4a07-80d7-fc8e6d2c665e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 07:56:50 crc kubenswrapper[4599]: I1012 07:56:50.605351 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx6lb\" (UniqueName: \"kubernetes.io/projected/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-kube-api-access-xx6lb\") on node \"crc\" DevicePath \"\""
Oct 12 07:56:50 crc kubenswrapper[4599]: I1012 07:56:50.605400 4599 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 12 07:56:50 crc kubenswrapper[4599]: I1012 07:56:50.605414 4599 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 12 07:56:50 crc kubenswrapper[4599]: I1012 07:56:50.605428 4599 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2098dde-6e8b-4a07-80d7-fc8e6d2c665e-inventory\") on node \"crc\" DevicePath \"\""
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.008535 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd" event={"ID":"f2098dde-6e8b-4a07-80d7-fc8e6d2c665e","Type":"ContainerDied","Data":"60a4f86efbea3b1a4758041a285b80d618ade755a74ac9a1df7c97050fddd487"}
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.008575 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60a4f86efbea3b1a4758041a285b80d618ade755a74ac9a1df7c97050fddd487"
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.008602 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd"
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.078593 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh"]
Oct 12 07:56:51 crc kubenswrapper[4599]: E1012 07:56:51.078958 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2098dde-6e8b-4a07-80d7-fc8e6d2c665e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.078976 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2098dde-6e8b-4a07-80d7-fc8e6d2c665e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.079148 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2098dde-6e8b-4a07-80d7-fc8e6d2c665e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.079800 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh"
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.081845 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x82pw"
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.082046 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.082092 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.082138 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.090854 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh"]
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.119957 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f37be313-7217-4822-82a3-b1c6edd70a45-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bphhh\" (UID: \"f37be313-7217-4822-82a3-b1c6edd70a45\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh"
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.120026 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbxm7\" (UniqueName: \"kubernetes.io/projected/f37be313-7217-4822-82a3-b1c6edd70a45-kube-api-access-jbxm7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bphhh\" (UID: \"f37be313-7217-4822-82a3-b1c6edd70a45\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh"
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.120172 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f37be313-7217-4822-82a3-b1c6edd70a45-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bphhh\" (UID: \"f37be313-7217-4822-82a3-b1c6edd70a45\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh"
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.222523 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f37be313-7217-4822-82a3-b1c6edd70a45-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bphhh\" (UID: \"f37be313-7217-4822-82a3-b1c6edd70a45\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh"
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.222613 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f37be313-7217-4822-82a3-b1c6edd70a45-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bphhh\" (UID: \"f37be313-7217-4822-82a3-b1c6edd70a45\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh"
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.222670 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbxm7\" (UniqueName: \"kubernetes.io/projected/f37be313-7217-4822-82a3-b1c6edd70a45-kube-api-access-jbxm7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bphhh\" (UID: \"f37be313-7217-4822-82a3-b1c6edd70a45\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh"
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.228115 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f37be313-7217-4822-82a3-b1c6edd70a45-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bphhh\" (UID: \"f37be313-7217-4822-82a3-b1c6edd70a45\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh"
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.229202 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f37be313-7217-4822-82a3-b1c6edd70a45-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bphhh\" (UID: \"f37be313-7217-4822-82a3-b1c6edd70a45\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh"
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.238069 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbxm7\" (UniqueName: \"kubernetes.io/projected/f37be313-7217-4822-82a3-b1c6edd70a45-kube-api-access-jbxm7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bphhh\" (UID: \"f37be313-7217-4822-82a3-b1c6edd70a45\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh"
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.394639 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh"
Oct 12 07:56:51 crc kubenswrapper[4599]: I1012 07:56:51.839961 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh"]
Oct 12 07:56:52 crc kubenswrapper[4599]: I1012 07:56:52.029649 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh" event={"ID":"f37be313-7217-4822-82a3-b1c6edd70a45","Type":"ContainerStarted","Data":"b8561890006cdffa06e8e393a6a90090ce7ce727d9f4270aa1e7e208cc7f5a12"}
Oct 12 07:56:53 crc kubenswrapper[4599]: I1012 07:56:53.041901 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh" event={"ID":"f37be313-7217-4822-82a3-b1c6edd70a45","Type":"ContainerStarted","Data":"6c8e09c69482a8b52a30c843cdaf87575937ce2999e8c7ed5b3979df77531707"}
Oct 12 07:56:53 crc kubenswrapper[4599]: I1012 07:56:53.065347 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh" podStartSLOduration=1.477162118 podStartE2EDuration="2.065317267s" podCreationTimestamp="2025-10-12 07:56:51 +0000 UTC" firstStartedPulling="2025-10-12 07:56:51.84991437 +0000 UTC m=+1308.639109872" lastFinishedPulling="2025-10-12 07:56:52.438069518 +0000 UTC m=+1309.227265021" observedRunningTime="2025-10-12 07:56:53.060084565 +0000 UTC m=+1309.849280067" watchObservedRunningTime="2025-10-12 07:56:53.065317267 +0000 UTC m=+1309.854512769"
Oct 12 07:58:05 crc kubenswrapper[4599]: I1012 07:58:05.054106 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zjr4s"]
Oct 12 07:58:05 crc kubenswrapper[4599]: I1012 07:58:05.057275 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjr4s"
Oct 12 07:58:05 crc kubenswrapper[4599]: I1012 07:58:05.071735 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zjr4s"]
Oct 12 07:58:05 crc kubenswrapper[4599]: I1012 07:58:05.082689 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dc6d7e5-45fa-4d49-a123-0582bf6404b5-utilities\") pod \"redhat-operators-zjr4s\" (UID: \"2dc6d7e5-45fa-4d49-a123-0582bf6404b5\") " pod="openshift-marketplace/redhat-operators-zjr4s"
Oct 12 07:58:05 crc kubenswrapper[4599]: I1012 07:58:05.082723 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dc6d7e5-45fa-4d49-a123-0582bf6404b5-catalog-content\") pod \"redhat-operators-zjr4s\" (UID: \"2dc6d7e5-45fa-4d49-a123-0582bf6404b5\") " pod="openshift-marketplace/redhat-operators-zjr4s"
Oct 12 07:58:05 crc kubenswrapper[4599]: I1012 07:58:05.082780 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cztxt\" (UniqueName: \"kubernetes.io/projected/2dc6d7e5-45fa-4d49-a123-0582bf6404b5-kube-api-access-cztxt\") pod \"redhat-operators-zjr4s\" (UID: \"2dc6d7e5-45fa-4d49-a123-0582bf6404b5\") " pod="openshift-marketplace/redhat-operators-zjr4s"
Oct 12 07:58:05 crc kubenswrapper[4599]: I1012 07:58:05.184743 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dc6d7e5-45fa-4d49-a123-0582bf6404b5-utilities\") pod \"redhat-operators-zjr4s\" (UID: \"2dc6d7e5-45fa-4d49-a123-0582bf6404b5\") " pod="openshift-marketplace/redhat-operators-zjr4s"
Oct 12 07:58:05 crc kubenswrapper[4599]: I1012 07:58:05.184793 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dc6d7e5-45fa-4d49-a123-0582bf6404b5-catalog-content\") pod \"redhat-operators-zjr4s\" (UID: \"2dc6d7e5-45fa-4d49-a123-0582bf6404b5\") " pod="openshift-marketplace/redhat-operators-zjr4s"
Oct 12 07:58:05 crc kubenswrapper[4599]: I1012 07:58:05.184860 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cztxt\" (UniqueName: \"kubernetes.io/projected/2dc6d7e5-45fa-4d49-a123-0582bf6404b5-kube-api-access-cztxt\") pod \"redhat-operators-zjr4s\" (UID: \"2dc6d7e5-45fa-4d49-a123-0582bf6404b5\") " pod="openshift-marketplace/redhat-operators-zjr4s"
Oct 12 07:58:05 crc kubenswrapper[4599]: I1012 07:58:05.185244 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dc6d7e5-45fa-4d49-a123-0582bf6404b5-utilities\") pod \"redhat-operators-zjr4s\" (UID: \"2dc6d7e5-45fa-4d49-a123-0582bf6404b5\") " pod="openshift-marketplace/redhat-operators-zjr4s"
Oct 12 07:58:05 crc kubenswrapper[4599]: I1012 07:58:05.185513 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dc6d7e5-45fa-4d49-a123-0582bf6404b5-catalog-content\") pod \"redhat-operators-zjr4s\" (UID: \"2dc6d7e5-45fa-4d49-a123-0582bf6404b5\") " pod="openshift-marketplace/redhat-operators-zjr4s"
Oct 12 07:58:05 crc kubenswrapper[4599]: I1012 07:58:05.204355 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cztxt\" (UniqueName: \"kubernetes.io/projected/2dc6d7e5-45fa-4d49-a123-0582bf6404b5-kube-api-access-cztxt\") pod \"redhat-operators-zjr4s\" (UID: \"2dc6d7e5-45fa-4d49-a123-0582bf6404b5\") " pod="openshift-marketplace/redhat-operators-zjr4s"
Oct 12 07:58:05 crc kubenswrapper[4599]: I1012 07:58:05.372149 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjr4s"
Oct 12 07:58:05 crc kubenswrapper[4599]: I1012 07:58:05.803576 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zjr4s"]
Oct 12 07:58:06 crc kubenswrapper[4599]: I1012 07:58:06.021559 4599 scope.go:117] "RemoveContainer" containerID="65fb22f48efa762e822f7330fbe6757c38f2eae38bc94dbe46fc002baf4228cd"
Oct 12 07:58:06 crc kubenswrapper[4599]: I1012 07:58:06.107266 4599 scope.go:117] "RemoveContainer" containerID="fd084a8cf01892565881cff8cde7533a774ae578612d822ce20bea0fe2efc930"
Oct 12 07:58:06 crc kubenswrapper[4599]: I1012 07:58:06.693240 4599 generic.go:334] "Generic (PLEG): container finished" podID="2dc6d7e5-45fa-4d49-a123-0582bf6404b5" containerID="2f685aca2bab1aeddad0910ed039d97f3ddf7c51a53c0d58ebdbd400c361b93b" exitCode=0
Oct 12 07:58:06 crc kubenswrapper[4599]: I1012 07:58:06.693302 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjr4s" event={"ID":"2dc6d7e5-45fa-4d49-a123-0582bf6404b5","Type":"ContainerDied","Data":"2f685aca2bab1aeddad0910ed039d97f3ddf7c51a53c0d58ebdbd400c361b93b"}
Oct 12 07:58:06 crc kubenswrapper[4599]: I1012 07:58:06.693346 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjr4s" event={"ID":"2dc6d7e5-45fa-4d49-a123-0582bf6404b5","Type":"ContainerStarted","Data":"84133944bc7599aed4625f69ff814c2396b97de35bc00fe3c6be820cdfdd37bf"}
Oct 12 07:58:07 crc kubenswrapper[4599]: I1012 07:58:07.702376 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjr4s" event={"ID":"2dc6d7e5-45fa-4d49-a123-0582bf6404b5","Type":"ContainerStarted","Data":"7b6a6d1a352bd4ed8b10b5bdfdb2e99ec5193cfe55b256d83a78ffa3b0cd453c"}
Oct 12 07:58:09 crc kubenswrapper[4599]: I1012 07:58:09.720440 4599 generic.go:334] "Generic (PLEG): container finished" podID="2dc6d7e5-45fa-4d49-a123-0582bf6404b5" containerID="7b6a6d1a352bd4ed8b10b5bdfdb2e99ec5193cfe55b256d83a78ffa3b0cd453c" exitCode=0
Oct 12 07:58:09 crc kubenswrapper[4599]: I1012 07:58:09.720520 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjr4s" event={"ID":"2dc6d7e5-45fa-4d49-a123-0582bf6404b5","Type":"ContainerDied","Data":"7b6a6d1a352bd4ed8b10b5bdfdb2e99ec5193cfe55b256d83a78ffa3b0cd453c"}
Oct 12 07:58:10 crc kubenswrapper[4599]: I1012 07:58:10.730446 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjr4s" event={"ID":"2dc6d7e5-45fa-4d49-a123-0582bf6404b5","Type":"ContainerStarted","Data":"4c274ea0a76e4e5aa3e49d1db260492b0836def3351292147a129baa50b63a82"}
Oct 12 07:58:10 crc kubenswrapper[4599]: I1012 07:58:10.745983 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zjr4s" podStartSLOduration=2.174688874 podStartE2EDuration="5.745971336s" podCreationTimestamp="2025-10-12 07:58:05 +0000 UTC" firstStartedPulling="2025-10-12 07:58:06.69483805 +0000 UTC m=+1383.484033551" lastFinishedPulling="2025-10-12 07:58:10.266120511 +0000 UTC m=+1387.055316013" observedRunningTime="2025-10-12 07:58:10.743819977 +0000 UTC m=+1387.533015480" watchObservedRunningTime="2025-10-12 07:58:10.745971336 +0000 UTC m=+1387.535166838"
Oct 12 07:58:15 crc kubenswrapper[4599]: I1012 07:58:15.372757 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zjr4s"
Oct 12 07:58:15 crc kubenswrapper[4599]: I1012 07:58:15.373277 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zjr4s"
Oct 12 07:58:15 crc kubenswrapper[4599]: I1012 07:58:15.409104 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zjr4s"
Oct 12 07:58:15 crc kubenswrapper[4599]: I1012 07:58:15.811350 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zjr4s"
Oct 12 07:58:15 crc kubenswrapper[4599]: I1012 07:58:15.853293 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zjr4s"]
Oct 12 07:58:17 crc kubenswrapper[4599]: I1012 07:58:17.790118 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zjr4s" podUID="2dc6d7e5-45fa-4d49-a123-0582bf6404b5" containerName="registry-server" containerID="cri-o://4c274ea0a76e4e5aa3e49d1db260492b0836def3351292147a129baa50b63a82" gracePeriod=2
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.168684 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjr4s"
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.218831 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cztxt\" (UniqueName: \"kubernetes.io/projected/2dc6d7e5-45fa-4d49-a123-0582bf6404b5-kube-api-access-cztxt\") pod \"2dc6d7e5-45fa-4d49-a123-0582bf6404b5\" (UID: \"2dc6d7e5-45fa-4d49-a123-0582bf6404b5\") "
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.219030 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dc6d7e5-45fa-4d49-a123-0582bf6404b5-utilities\") pod \"2dc6d7e5-45fa-4d49-a123-0582bf6404b5\" (UID: \"2dc6d7e5-45fa-4d49-a123-0582bf6404b5\") "
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.219249 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dc6d7e5-45fa-4d49-a123-0582bf6404b5-catalog-content\") pod \"2dc6d7e5-45fa-4d49-a123-0582bf6404b5\" (UID: \"2dc6d7e5-45fa-4d49-a123-0582bf6404b5\") "
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.219666 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dc6d7e5-45fa-4d49-a123-0582bf6404b5-utilities" (OuterVolumeSpecName: "utilities") pod "2dc6d7e5-45fa-4d49-a123-0582bf6404b5" (UID: "2dc6d7e5-45fa-4d49-a123-0582bf6404b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.219823 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dc6d7e5-45fa-4d49-a123-0582bf6404b5-utilities\") on node \"crc\" DevicePath \"\""
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.223857 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dc6d7e5-45fa-4d49-a123-0582bf6404b5-kube-api-access-cztxt" (OuterVolumeSpecName: "kube-api-access-cztxt") pod "2dc6d7e5-45fa-4d49-a123-0582bf6404b5" (UID: "2dc6d7e5-45fa-4d49-a123-0582bf6404b5"). InnerVolumeSpecName "kube-api-access-cztxt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.283401 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dc6d7e5-45fa-4d49-a123-0582bf6404b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2dc6d7e5-45fa-4d49-a123-0582bf6404b5" (UID: "2dc6d7e5-45fa-4d49-a123-0582bf6404b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.320931 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cztxt\" (UniqueName: \"kubernetes.io/projected/2dc6d7e5-45fa-4d49-a123-0582bf6404b5-kube-api-access-cztxt\") on node \"crc\" DevicePath \"\""
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.321095 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dc6d7e5-45fa-4d49-a123-0582bf6404b5-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.800566 4599 generic.go:334] "Generic (PLEG): container finished" podID="2dc6d7e5-45fa-4d49-a123-0582bf6404b5" containerID="4c274ea0a76e4e5aa3e49d1db260492b0836def3351292147a129baa50b63a82" exitCode=0
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.800618 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjr4s" event={"ID":"2dc6d7e5-45fa-4d49-a123-0582bf6404b5","Type":"ContainerDied","Data":"4c274ea0a76e4e5aa3e49d1db260492b0836def3351292147a129baa50b63a82"}
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.800648 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjr4s" event={"ID":"2dc6d7e5-45fa-4d49-a123-0582bf6404b5","Type":"ContainerDied","Data":"84133944bc7599aed4625f69ff814c2396b97de35bc00fe3c6be820cdfdd37bf"}
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.800670 4599 scope.go:117] "RemoveContainer" containerID="4c274ea0a76e4e5aa3e49d1db260492b0836def3351292147a129baa50b63a82"
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.800808 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjr4s"
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.829073 4599 scope.go:117] "RemoveContainer" containerID="7b6a6d1a352bd4ed8b10b5bdfdb2e99ec5193cfe55b256d83a78ffa3b0cd453c"
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.834736 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zjr4s"]
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.842729 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zjr4s"]
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.850732 4599 scope.go:117] "RemoveContainer" containerID="2f685aca2bab1aeddad0910ed039d97f3ddf7c51a53c0d58ebdbd400c361b93b"
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.886783 4599 scope.go:117] "RemoveContainer" containerID="4c274ea0a76e4e5aa3e49d1db260492b0836def3351292147a129baa50b63a82"
Oct 12 07:58:18 crc kubenswrapper[4599]: E1012 07:58:18.887190 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c274ea0a76e4e5aa3e49d1db260492b0836def3351292147a129baa50b63a82\": container with ID starting with 4c274ea0a76e4e5aa3e49d1db260492b0836def3351292147a129baa50b63a82 not found: ID does not exist" containerID="4c274ea0a76e4e5aa3e49d1db260492b0836def3351292147a129baa50b63a82"
Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.887221 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c274ea0a76e4e5aa3e49d1db260492b0836def3351292147a129baa50b63a82"} err="failed to get container status \"4c274ea0a76e4e5aa3e49d1db260492b0836def3351292147a129baa50b63a82\": rpc error: code = NotFound desc = could not find container \"4c274ea0a76e4e5aa3e49d1db260492b0836def3351292147a129baa50b63a82\": container with ID starting with 4c274ea0a76e4e5aa3e49d1db260492b0836def3351292147a129baa50b63a82 not found: ID does
not exist" Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.887245 4599 scope.go:117] "RemoveContainer" containerID="7b6a6d1a352bd4ed8b10b5bdfdb2e99ec5193cfe55b256d83a78ffa3b0cd453c" Oct 12 07:58:18 crc kubenswrapper[4599]: E1012 07:58:18.887843 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b6a6d1a352bd4ed8b10b5bdfdb2e99ec5193cfe55b256d83a78ffa3b0cd453c\": container with ID starting with 7b6a6d1a352bd4ed8b10b5bdfdb2e99ec5193cfe55b256d83a78ffa3b0cd453c not found: ID does not exist" containerID="7b6a6d1a352bd4ed8b10b5bdfdb2e99ec5193cfe55b256d83a78ffa3b0cd453c" Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.887878 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6a6d1a352bd4ed8b10b5bdfdb2e99ec5193cfe55b256d83a78ffa3b0cd453c"} err="failed to get container status \"7b6a6d1a352bd4ed8b10b5bdfdb2e99ec5193cfe55b256d83a78ffa3b0cd453c\": rpc error: code = NotFound desc = could not find container \"7b6a6d1a352bd4ed8b10b5bdfdb2e99ec5193cfe55b256d83a78ffa3b0cd453c\": container with ID starting with 7b6a6d1a352bd4ed8b10b5bdfdb2e99ec5193cfe55b256d83a78ffa3b0cd453c not found: ID does not exist" Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.887901 4599 scope.go:117] "RemoveContainer" containerID="2f685aca2bab1aeddad0910ed039d97f3ddf7c51a53c0d58ebdbd400c361b93b" Oct 12 07:58:18 crc kubenswrapper[4599]: E1012 07:58:18.888166 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f685aca2bab1aeddad0910ed039d97f3ddf7c51a53c0d58ebdbd400c361b93b\": container with ID starting with 2f685aca2bab1aeddad0910ed039d97f3ddf7c51a53c0d58ebdbd400c361b93b not found: ID does not exist" containerID="2f685aca2bab1aeddad0910ed039d97f3ddf7c51a53c0d58ebdbd400c361b93b" Oct 12 07:58:18 crc kubenswrapper[4599]: I1012 07:58:18.888195 4599 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f685aca2bab1aeddad0910ed039d97f3ddf7c51a53c0d58ebdbd400c361b93b"} err="failed to get container status \"2f685aca2bab1aeddad0910ed039d97f3ddf7c51a53c0d58ebdbd400c361b93b\": rpc error: code = NotFound desc = could not find container \"2f685aca2bab1aeddad0910ed039d97f3ddf7c51a53c0d58ebdbd400c361b93b\": container with ID starting with 2f685aca2bab1aeddad0910ed039d97f3ddf7c51a53c0d58ebdbd400c361b93b not found: ID does not exist" Oct 12 07:58:19 crc kubenswrapper[4599]: I1012 07:58:19.556206 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dc6d7e5-45fa-4d49-a123-0582bf6404b5" path="/var/lib/kubelet/pods/2dc6d7e5-45fa-4d49-a123-0582bf6404b5/volumes" Oct 12 07:58:28 crc kubenswrapper[4599]: I1012 07:58:28.321742 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:58:28 crc kubenswrapper[4599]: I1012 07:58:28.322308 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:58:29 crc kubenswrapper[4599]: I1012 07:58:29.034082 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-qbjjh"] Oct 12 07:58:29 crc kubenswrapper[4599]: I1012 07:58:29.040591 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-cdhl9"] Oct 12 07:58:29 crc kubenswrapper[4599]: I1012 07:58:29.047757 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-gbrxb"] Oct 12 07:58:29 crc 
kubenswrapper[4599]: I1012 07:58:29.054050 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-qbjjh"] Oct 12 07:58:29 crc kubenswrapper[4599]: I1012 07:58:29.059218 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-cdhl9"] Oct 12 07:58:29 crc kubenswrapper[4599]: I1012 07:58:29.064086 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-gbrxb"] Oct 12 07:58:29 crc kubenswrapper[4599]: I1012 07:58:29.553411 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a8d4eed-9d78-407f-a3fb-db0af1beac68" path="/var/lib/kubelet/pods/3a8d4eed-9d78-407f-a3fb-db0af1beac68/volumes" Oct 12 07:58:29 crc kubenswrapper[4599]: I1012 07:58:29.554045 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43961ba0-a4ba-4c09-875a-e461a27bac2a" path="/var/lib/kubelet/pods/43961ba0-a4ba-4c09-875a-e461a27bac2a/volumes" Oct 12 07:58:29 crc kubenswrapper[4599]: I1012 07:58:29.554542 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85dd307d-9c6b-4d7d-b639-5486431ba73f" path="/var/lib/kubelet/pods/85dd307d-9c6b-4d7d-b639-5486431ba73f/volumes" Oct 12 07:58:36 crc kubenswrapper[4599]: I1012 07:58:36.025895 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b848-account-create-vx6cq"] Oct 12 07:58:36 crc kubenswrapper[4599]: I1012 07:58:36.031621 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b848-account-create-vx6cq"] Oct 12 07:58:37 crc kubenswrapper[4599]: I1012 07:58:37.556218 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53b52909-5298-47ad-a169-91067086a742" path="/var/lib/kubelet/pods/53b52909-5298-47ad-a169-91067086a742/volumes" Oct 12 07:58:38 crc kubenswrapper[4599]: I1012 07:58:38.972781 4599 generic.go:334] "Generic (PLEG): container finished" podID="f37be313-7217-4822-82a3-b1c6edd70a45" 
containerID="6c8e09c69482a8b52a30c843cdaf87575937ce2999e8c7ed5b3979df77531707" exitCode=0 Oct 12 07:58:38 crc kubenswrapper[4599]: I1012 07:58:38.972865 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh" event={"ID":"f37be313-7217-4822-82a3-b1c6edd70a45","Type":"ContainerDied","Data":"6c8e09c69482a8b52a30c843cdaf87575937ce2999e8c7ed5b3979df77531707"} Oct 12 07:58:40 crc kubenswrapper[4599]: I1012 07:58:40.318768 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh" Oct 12 07:58:40 crc kubenswrapper[4599]: I1012 07:58:40.510290 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbxm7\" (UniqueName: \"kubernetes.io/projected/f37be313-7217-4822-82a3-b1c6edd70a45-kube-api-access-jbxm7\") pod \"f37be313-7217-4822-82a3-b1c6edd70a45\" (UID: \"f37be313-7217-4822-82a3-b1c6edd70a45\") " Oct 12 07:58:40 crc kubenswrapper[4599]: I1012 07:58:40.510393 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f37be313-7217-4822-82a3-b1c6edd70a45-ssh-key\") pod \"f37be313-7217-4822-82a3-b1c6edd70a45\" (UID: \"f37be313-7217-4822-82a3-b1c6edd70a45\") " Oct 12 07:58:40 crc kubenswrapper[4599]: I1012 07:58:40.510451 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f37be313-7217-4822-82a3-b1c6edd70a45-inventory\") pod \"f37be313-7217-4822-82a3-b1c6edd70a45\" (UID: \"f37be313-7217-4822-82a3-b1c6edd70a45\") " Oct 12 07:58:40 crc kubenswrapper[4599]: I1012 07:58:40.529598 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f37be313-7217-4822-82a3-b1c6edd70a45-kube-api-access-jbxm7" (OuterVolumeSpecName: "kube-api-access-jbxm7") pod 
"f37be313-7217-4822-82a3-b1c6edd70a45" (UID: "f37be313-7217-4822-82a3-b1c6edd70a45"). InnerVolumeSpecName "kube-api-access-jbxm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:58:40 crc kubenswrapper[4599]: I1012 07:58:40.533250 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f37be313-7217-4822-82a3-b1c6edd70a45-inventory" (OuterVolumeSpecName: "inventory") pod "f37be313-7217-4822-82a3-b1c6edd70a45" (UID: "f37be313-7217-4822-82a3-b1c6edd70a45"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:58:40 crc kubenswrapper[4599]: I1012 07:58:40.533646 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f37be313-7217-4822-82a3-b1c6edd70a45-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f37be313-7217-4822-82a3-b1c6edd70a45" (UID: "f37be313-7217-4822-82a3-b1c6edd70a45"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:58:40 crc kubenswrapper[4599]: I1012 07:58:40.613789 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbxm7\" (UniqueName: \"kubernetes.io/projected/f37be313-7217-4822-82a3-b1c6edd70a45-kube-api-access-jbxm7\") on node \"crc\" DevicePath \"\"" Oct 12 07:58:40 crc kubenswrapper[4599]: I1012 07:58:40.614039 4599 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f37be313-7217-4822-82a3-b1c6edd70a45-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 07:58:40 crc kubenswrapper[4599]: I1012 07:58:40.614050 4599 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f37be313-7217-4822-82a3-b1c6edd70a45-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 07:58:40 crc kubenswrapper[4599]: I1012 07:58:40.992570 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh" event={"ID":"f37be313-7217-4822-82a3-b1c6edd70a45","Type":"ContainerDied","Data":"b8561890006cdffa06e8e393a6a90090ce7ce727d9f4270aa1e7e208cc7f5a12"} Oct 12 07:58:40 crc kubenswrapper[4599]: I1012 07:58:40.992632 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8561890006cdffa06e8e393a6a90090ce7ce727d9f4270aa1e7e208cc7f5a12" Oct 12 07:58:40 crc kubenswrapper[4599]: I1012 07:58:40.992656 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bphhh" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.063317 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx"] Oct 12 07:58:41 crc kubenswrapper[4599]: E1012 07:58:41.064039 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc6d7e5-45fa-4d49-a123-0582bf6404b5" containerName="extract-content" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.064065 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc6d7e5-45fa-4d49-a123-0582bf6404b5" containerName="extract-content" Oct 12 07:58:41 crc kubenswrapper[4599]: E1012 07:58:41.064099 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc6d7e5-45fa-4d49-a123-0582bf6404b5" containerName="extract-utilities" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.064107 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc6d7e5-45fa-4d49-a123-0582bf6404b5" containerName="extract-utilities" Oct 12 07:58:41 crc kubenswrapper[4599]: E1012 07:58:41.064132 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc6d7e5-45fa-4d49-a123-0582bf6404b5" containerName="registry-server" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.064138 4599 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2dc6d7e5-45fa-4d49-a123-0582bf6404b5" containerName="registry-server" Oct 12 07:58:41 crc kubenswrapper[4599]: E1012 07:58:41.064147 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f37be313-7217-4822-82a3-b1c6edd70a45" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.064153 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37be313-7217-4822-82a3-b1c6edd70a45" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.064438 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dc6d7e5-45fa-4d49-a123-0582bf6404b5" containerName="registry-server" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.064461 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="f37be313-7217-4822-82a3-b1c6edd70a45" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.065237 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.069215 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.069243 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.069475 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.071739 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x82pw" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.075414 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx"] Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.121756 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9988d105-d7f0-459a-a8d9-056ac0d3abab-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx\" (UID: \"9988d105-d7f0-459a-a8d9-056ac0d3abab\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.121904 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9988d105-d7f0-459a-a8d9-056ac0d3abab-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx\" (UID: \"9988d105-d7f0-459a-a8d9-056ac0d3abab\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.122062 4599 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49pqb\" (UniqueName: \"kubernetes.io/projected/9988d105-d7f0-459a-a8d9-056ac0d3abab-kube-api-access-49pqb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx\" (UID: \"9988d105-d7f0-459a-a8d9-056ac0d3abab\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.223774 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49pqb\" (UniqueName: \"kubernetes.io/projected/9988d105-d7f0-459a-a8d9-056ac0d3abab-kube-api-access-49pqb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx\" (UID: \"9988d105-d7f0-459a-a8d9-056ac0d3abab\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.223984 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9988d105-d7f0-459a-a8d9-056ac0d3abab-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx\" (UID: \"9988d105-d7f0-459a-a8d9-056ac0d3abab\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.224138 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9988d105-d7f0-459a-a8d9-056ac0d3abab-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx\" (UID: \"9988d105-d7f0-459a-a8d9-056ac0d3abab\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.228128 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9988d105-d7f0-459a-a8d9-056ac0d3abab-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx\" (UID: \"9988d105-d7f0-459a-a8d9-056ac0d3abab\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.228199 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9988d105-d7f0-459a-a8d9-056ac0d3abab-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx\" (UID: \"9988d105-d7f0-459a-a8d9-056ac0d3abab\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.238610 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49pqb\" (UniqueName: \"kubernetes.io/projected/9988d105-d7f0-459a-a8d9-056ac0d3abab-kube-api-access-49pqb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx\" (UID: \"9988d105-d7f0-459a-a8d9-056ac0d3abab\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.387841 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx" Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.832466 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx"] Oct 12 07:58:41 crc kubenswrapper[4599]: I1012 07:58:41.836325 4599 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 07:58:42 crc kubenswrapper[4599]: I1012 07:58:42.002726 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx" event={"ID":"9988d105-d7f0-459a-a8d9-056ac0d3abab","Type":"ContainerStarted","Data":"ed16fd8b3921eec2015fac1898b5d5dac6ee39b4a9967bfc06ada806131b62ad"} Oct 12 07:58:43 crc kubenswrapper[4599]: I1012 07:58:43.012689 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx" event={"ID":"9988d105-d7f0-459a-a8d9-056ac0d3abab","Type":"ContainerStarted","Data":"5178afb38c7b1209f5b4b5f4b60a4f548de1af8f4cd55050225ef3bf19c0f6cc"} Oct 12 07:58:43 crc kubenswrapper[4599]: I1012 07:58:43.030324 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx" podStartSLOduration=1.390135878 podStartE2EDuration="2.030296017s" podCreationTimestamp="2025-10-12 07:58:41 +0000 UTC" firstStartedPulling="2025-10-12 07:58:41.836105974 +0000 UTC m=+1418.625301476" lastFinishedPulling="2025-10-12 07:58:42.476266113 +0000 UTC m=+1419.265461615" observedRunningTime="2025-10-12 07:58:43.025089053 +0000 UTC m=+1419.814284556" watchObservedRunningTime="2025-10-12 07:58:43.030296017 +0000 UTC m=+1419.819491519" Oct 12 07:58:46 crc kubenswrapper[4599]: I1012 07:58:46.025014 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-79f1-account-create-n2j4q"] Oct 12 07:58:46 crc 
kubenswrapper[4599]: I1012 07:58:46.032572 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-fe76-account-create-8pkc5"] Oct 12 07:58:46 crc kubenswrapper[4599]: I1012 07:58:46.040103 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-79f1-account-create-n2j4q"] Oct 12 07:58:46 crc kubenswrapper[4599]: I1012 07:58:46.046725 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-fe76-account-create-8pkc5"] Oct 12 07:58:47 crc kubenswrapper[4599]: I1012 07:58:47.553851 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc2fcc8c-96c9-4892-968d-f738570bc088" path="/var/lib/kubelet/pods/bc2fcc8c-96c9-4892-968d-f738570bc088/volumes" Oct 12 07:58:47 crc kubenswrapper[4599]: I1012 07:58:47.554605 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbd9c1bf-e0d7-4ee9-acfb-864371dd2237" path="/var/lib/kubelet/pods/cbd9c1bf-e0d7-4ee9-acfb-864371dd2237/volumes" Oct 12 07:58:58 crc kubenswrapper[4599]: I1012 07:58:58.322361 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:58:58 crc kubenswrapper[4599]: I1012 07:58:58.323113 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:59:00 crc kubenswrapper[4599]: I1012 07:59:00.036293 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-mhctb"] Oct 12 07:59:00 crc kubenswrapper[4599]: I1012 07:59:00.041730 4599 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/neutron-db-create-hmf2g"] Oct 12 07:59:00 crc kubenswrapper[4599]: I1012 07:59:00.046695 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-hmf2g"] Oct 12 07:59:00 crc kubenswrapper[4599]: I1012 07:59:00.051741 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-qdm5f"] Oct 12 07:59:00 crc kubenswrapper[4599]: I1012 07:59:00.056422 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-mhctb"] Oct 12 07:59:00 crc kubenswrapper[4599]: I1012 07:59:00.061010 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-qdm5f"] Oct 12 07:59:01 crc kubenswrapper[4599]: I1012 07:59:01.557831 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16cbe17e-851b-4275-9a2b-13ca14459b4a" path="/var/lib/kubelet/pods/16cbe17e-851b-4275-9a2b-13ca14459b4a/volumes" Oct 12 07:59:01 crc kubenswrapper[4599]: I1012 07:59:01.559021 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="797ba2c7-ef03-4514-b48c-d4267a650fbc" path="/var/lib/kubelet/pods/797ba2c7-ef03-4514-b48c-d4267a650fbc/volumes" Oct 12 07:59:01 crc kubenswrapper[4599]: I1012 07:59:01.559589 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f881966-2c70-43ee-bcbd-2fca447e0697" path="/var/lib/kubelet/pods/9f881966-2c70-43ee-bcbd-2fca447e0697/volumes" Oct 12 07:59:02 crc kubenswrapper[4599]: I1012 07:59:02.027804 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-p542w"] Oct 12 07:59:02 crc kubenswrapper[4599]: I1012 07:59:02.033544 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-p542w"] Oct 12 07:59:03 crc kubenswrapper[4599]: I1012 07:59:03.554399 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0041b3a-cda2-439c-96ae-673642206886" 
path="/var/lib/kubelet/pods/b0041b3a-cda2-439c-96ae-673642206886/volumes" Oct 12 07:59:05 crc kubenswrapper[4599]: I1012 07:59:05.027324 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-s54p2"] Oct 12 07:59:05 crc kubenswrapper[4599]: I1012 07:59:05.033665 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-s54p2"] Oct 12 07:59:05 crc kubenswrapper[4599]: I1012 07:59:05.568266 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff665457-da4e-4e70-a9be-a64b343bd4d0" path="/var/lib/kubelet/pods/ff665457-da4e-4e70-a9be-a64b343bd4d0/volumes" Oct 12 07:59:06 crc kubenswrapper[4599]: I1012 07:59:06.155244 4599 scope.go:117] "RemoveContainer" containerID="9225b2079f92039fddefdaea4f9004ffc1332886b48003d919e8eaaba5c27286" Oct 12 07:59:06 crc kubenswrapper[4599]: I1012 07:59:06.180604 4599 scope.go:117] "RemoveContainer" containerID="b0cbe86729cc44763ec50cdb04ac7a46a5f4f4828cee9ecd30479a94c0f0171d" Oct 12 07:59:06 crc kubenswrapper[4599]: I1012 07:59:06.223849 4599 scope.go:117] "RemoveContainer" containerID="924847be9fa5a8aa64d08a40d77dad3f80609e2a1894277131b7b8f46b054cac" Oct 12 07:59:06 crc kubenswrapper[4599]: I1012 07:59:06.276107 4599 scope.go:117] "RemoveContainer" containerID="62f62a406e902ea1e0d5391caf9693d29335193539fae3f9350db1cfaecc71ac" Oct 12 07:59:06 crc kubenswrapper[4599]: I1012 07:59:06.298404 4599 scope.go:117] "RemoveContainer" containerID="22506781a9494fc947aaf801d02f63f16e02f92ec872e0b38ed7e3b7fb31f1c3" Oct 12 07:59:06 crc kubenswrapper[4599]: I1012 07:59:06.340163 4599 scope.go:117] "RemoveContainer" containerID="64e1a086baf135ecc5b91d8a3df91b4d276ae6107b96740f0e62ff6c6eb47683" Oct 12 07:59:06 crc kubenswrapper[4599]: I1012 07:59:06.390553 4599 scope.go:117] "RemoveContainer" containerID="ae1828343217bb40457a4eb0c045f4d10b0868019b799f41da9b7b4a4713abac" Oct 12 07:59:06 crc kubenswrapper[4599]: I1012 07:59:06.421988 4599 scope.go:117] "RemoveContainer" 
containerID="9f6c0bcc87a2d60f702d94bc107a73f5890df0096454007664f20e3334776a9d" Oct 12 07:59:06 crc kubenswrapper[4599]: I1012 07:59:06.441074 4599 scope.go:117] "RemoveContainer" containerID="3ab6a014bd08df342161a19137d03a135cc021fc7bfbdd1778e28481079c111c" Oct 12 07:59:06 crc kubenswrapper[4599]: I1012 07:59:06.457055 4599 scope.go:117] "RemoveContainer" containerID="55963fb841c3f600fb80adce0627448e7a10416bcc3f410bd1453d2c01b3066d" Oct 12 07:59:06 crc kubenswrapper[4599]: I1012 07:59:06.476063 4599 scope.go:117] "RemoveContainer" containerID="79f4f29286f309ffef46334b4f8654641852a050e08565d35c5a53340cbc443c" Oct 12 07:59:12 crc kubenswrapper[4599]: I1012 07:59:12.083048 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-66mxc"] Oct 12 07:59:12 crc kubenswrapper[4599]: I1012 07:59:12.087229 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-66mxc" Oct 12 07:59:12 crc kubenswrapper[4599]: I1012 07:59:12.095274 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-66mxc"] Oct 12 07:59:12 crc kubenswrapper[4599]: I1012 07:59:12.134414 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8vnq\" (UniqueName: \"kubernetes.io/projected/e80719a9-0b5a-418f-b038-36c385847f8a-kube-api-access-w8vnq\") pod \"community-operators-66mxc\" (UID: \"e80719a9-0b5a-418f-b038-36c385847f8a\") " pod="openshift-marketplace/community-operators-66mxc" Oct 12 07:59:12 crc kubenswrapper[4599]: I1012 07:59:12.134472 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e80719a9-0b5a-418f-b038-36c385847f8a-catalog-content\") pod \"community-operators-66mxc\" (UID: \"e80719a9-0b5a-418f-b038-36c385847f8a\") " pod="openshift-marketplace/community-operators-66mxc" Oct 
12 07:59:12 crc kubenswrapper[4599]: I1012 07:59:12.134532 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e80719a9-0b5a-418f-b038-36c385847f8a-utilities\") pod \"community-operators-66mxc\" (UID: \"e80719a9-0b5a-418f-b038-36c385847f8a\") " pod="openshift-marketplace/community-operators-66mxc"
Oct 12 07:59:12 crc kubenswrapper[4599]: I1012 07:59:12.236226 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8vnq\" (UniqueName: \"kubernetes.io/projected/e80719a9-0b5a-418f-b038-36c385847f8a-kube-api-access-w8vnq\") pod \"community-operators-66mxc\" (UID: \"e80719a9-0b5a-418f-b038-36c385847f8a\") " pod="openshift-marketplace/community-operators-66mxc"
Oct 12 07:59:12 crc kubenswrapper[4599]: I1012 07:59:12.236317 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e80719a9-0b5a-418f-b038-36c385847f8a-catalog-content\") pod \"community-operators-66mxc\" (UID: \"e80719a9-0b5a-418f-b038-36c385847f8a\") " pod="openshift-marketplace/community-operators-66mxc"
Oct 12 07:59:12 crc kubenswrapper[4599]: I1012 07:59:12.236478 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e80719a9-0b5a-418f-b038-36c385847f8a-utilities\") pod \"community-operators-66mxc\" (UID: \"e80719a9-0b5a-418f-b038-36c385847f8a\") " pod="openshift-marketplace/community-operators-66mxc"
Oct 12 07:59:12 crc kubenswrapper[4599]: I1012 07:59:12.236796 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e80719a9-0b5a-418f-b038-36c385847f8a-catalog-content\") pod \"community-operators-66mxc\" (UID: \"e80719a9-0b5a-418f-b038-36c385847f8a\") " pod="openshift-marketplace/community-operators-66mxc"
Oct 12 07:59:12 crc kubenswrapper[4599]: I1012 07:59:12.236905 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e80719a9-0b5a-418f-b038-36c385847f8a-utilities\") pod \"community-operators-66mxc\" (UID: \"e80719a9-0b5a-418f-b038-36c385847f8a\") " pod="openshift-marketplace/community-operators-66mxc"
Oct 12 07:59:12 crc kubenswrapper[4599]: I1012 07:59:12.253728 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8vnq\" (UniqueName: \"kubernetes.io/projected/e80719a9-0b5a-418f-b038-36c385847f8a-kube-api-access-w8vnq\") pod \"community-operators-66mxc\" (UID: \"e80719a9-0b5a-418f-b038-36c385847f8a\") " pod="openshift-marketplace/community-operators-66mxc"
Oct 12 07:59:12 crc kubenswrapper[4599]: I1012 07:59:12.407179 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-66mxc"
Oct 12 07:59:12 crc kubenswrapper[4599]: I1012 07:59:12.866617 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-66mxc"]
Oct 12 07:59:13 crc kubenswrapper[4599]: I1012 07:59:13.294018 4599 generic.go:334] "Generic (PLEG): container finished" podID="e80719a9-0b5a-418f-b038-36c385847f8a" containerID="79b1e5512da66f7a07bb9738ba399332a265104058c05ee2ce3b03d6eafddfaa" exitCode=0
Oct 12 07:59:13 crc kubenswrapper[4599]: I1012 07:59:13.294128 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-66mxc" event={"ID":"e80719a9-0b5a-418f-b038-36c385847f8a","Type":"ContainerDied","Data":"79b1e5512da66f7a07bb9738ba399332a265104058c05ee2ce3b03d6eafddfaa"}
Oct 12 07:59:13 crc kubenswrapper[4599]: I1012 07:59:13.294426 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-66mxc" event={"ID":"e80719a9-0b5a-418f-b038-36c385847f8a","Type":"ContainerStarted","Data":"496da70f0917c3a9607788a6f81dd3f2b6c78f8d90740903a4117149f49034d5"}
Oct 12 07:59:14 crc kubenswrapper[4599]: I1012 07:59:14.023663 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-156a-account-create-6tgjb"]
Oct 12 07:59:14 crc kubenswrapper[4599]: I1012 07:59:14.032428 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-156a-account-create-6tgjb"]
Oct 12 07:59:14 crc kubenswrapper[4599]: I1012 07:59:14.038029 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c095-account-create-4scdb"]
Oct 12 07:59:14 crc kubenswrapper[4599]: I1012 07:59:14.043401 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-c095-account-create-4scdb"]
Oct 12 07:59:14 crc kubenswrapper[4599]: I1012 07:59:14.305083 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-66mxc" event={"ID":"e80719a9-0b5a-418f-b038-36c385847f8a","Type":"ContainerStarted","Data":"27084f2c0586da03169458edf68a3a12dbe56aaf4938103f6d7b03cd04da2863"}
Oct 12 07:59:15 crc kubenswrapper[4599]: I1012 07:59:15.316368 4599 generic.go:334] "Generic (PLEG): container finished" podID="e80719a9-0b5a-418f-b038-36c385847f8a" containerID="27084f2c0586da03169458edf68a3a12dbe56aaf4938103f6d7b03cd04da2863" exitCode=0
Oct 12 07:59:15 crc kubenswrapper[4599]: I1012 07:59:15.316414 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-66mxc" event={"ID":"e80719a9-0b5a-418f-b038-36c385847f8a","Type":"ContainerDied","Data":"27084f2c0586da03169458edf68a3a12dbe56aaf4938103f6d7b03cd04da2863"}
Oct 12 07:59:15 crc kubenswrapper[4599]: I1012 07:59:15.554509 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3775409f-2e10-405a-b777-66ce4f084bd7" path="/var/lib/kubelet/pods/3775409f-2e10-405a-b777-66ce4f084bd7/volumes"
Oct 12 07:59:15 crc kubenswrapper[4599]: I1012 07:59:15.555252 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef050d1a-fc88-429d-a19a-eb55d2933057" path="/var/lib/kubelet/pods/ef050d1a-fc88-429d-a19a-eb55d2933057/volumes"
Oct 12 07:59:16 crc kubenswrapper[4599]: I1012 07:59:16.329416 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-66mxc" event={"ID":"e80719a9-0b5a-418f-b038-36c385847f8a","Type":"ContainerStarted","Data":"0d1e01e071042d630ab26c984378d91b3a8e6b04b0e33cbedcdc02adfa74d792"}
Oct 12 07:59:16 crc kubenswrapper[4599]: I1012 07:59:16.356744 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-66mxc" podStartSLOduration=1.8048553090000001 podStartE2EDuration="4.356728005s" podCreationTimestamp="2025-10-12 07:59:12 +0000 UTC" firstStartedPulling="2025-10-12 07:59:13.296415376 +0000 UTC m=+1450.085610879" lastFinishedPulling="2025-10-12 07:59:15.848288073 +0000 UTC m=+1452.637483575" observedRunningTime="2025-10-12 07:59:16.349363471 +0000 UTC m=+1453.138558973" watchObservedRunningTime="2025-10-12 07:59:16.356728005 +0000 UTC m=+1453.145923507"
Oct 12 07:59:22 crc kubenswrapper[4599]: I1012 07:59:22.407614 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-66mxc"
Oct 12 07:59:22 crc kubenswrapper[4599]: I1012 07:59:22.408211 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-66mxc"
Oct 12 07:59:22 crc kubenswrapper[4599]: I1012 07:59:22.448982 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-66mxc"
Oct 12 07:59:23 crc kubenswrapper[4599]: I1012 07:59:23.033153 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-pkz8c"]
Oct 12 07:59:23 crc kubenswrapper[4599]: I1012 07:59:23.039841 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-tsv76"]
Oct 12 07:59:23 crc kubenswrapper[4599]: I1012 07:59:23.047837 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-0e9d-account-create-xlkmf"]
Oct 12 07:59:23 crc kubenswrapper[4599]: I1012 07:59:23.055068 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-0e9d-account-create-xlkmf"]
Oct 12 07:59:23 crc kubenswrapper[4599]: I1012 07:59:23.060307 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-tsv76"]
Oct 12 07:59:23 crc kubenswrapper[4599]: I1012 07:59:23.086959 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-pkz8c"]
Oct 12 07:59:23 crc kubenswrapper[4599]: I1012 07:59:23.434044 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-66mxc"
Oct 12 07:59:23 crc kubenswrapper[4599]: I1012 07:59:23.479961 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-66mxc"]
Oct 12 07:59:23 crc kubenswrapper[4599]: I1012 07:59:23.556089 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="616ae17d-da7f-4f46-9d1e-234bcb028377" path="/var/lib/kubelet/pods/616ae17d-da7f-4f46-9d1e-234bcb028377/volumes"
Oct 12 07:59:23 crc kubenswrapper[4599]: I1012 07:59:23.556870 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d43ed11-7175-498a-8dc1-e15cfc41b5c8" path="/var/lib/kubelet/pods/9d43ed11-7175-498a-8dc1-e15cfc41b5c8/volumes"
Oct 12 07:59:23 crc kubenswrapper[4599]: I1012 07:59:23.557461 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe135e30-f182-4517-8605-1097e7391663" path="/var/lib/kubelet/pods/fe135e30-f182-4517-8605-1097e7391663/volumes"
Oct 12 07:59:25 crc kubenswrapper[4599]: I1012 07:59:25.415549 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-66mxc" podUID="e80719a9-0b5a-418f-b038-36c385847f8a" containerName="registry-server" containerID="cri-o://0d1e01e071042d630ab26c984378d91b3a8e6b04b0e33cbedcdc02adfa74d792" gracePeriod=2
Oct 12 07:59:25 crc kubenswrapper[4599]: I1012 07:59:25.807804 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-66mxc"
Oct 12 07:59:25 crc kubenswrapper[4599]: I1012 07:59:25.935495 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8vnq\" (UniqueName: \"kubernetes.io/projected/e80719a9-0b5a-418f-b038-36c385847f8a-kube-api-access-w8vnq\") pod \"e80719a9-0b5a-418f-b038-36c385847f8a\" (UID: \"e80719a9-0b5a-418f-b038-36c385847f8a\") "
Oct 12 07:59:25 crc kubenswrapper[4599]: I1012 07:59:25.936017 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e80719a9-0b5a-418f-b038-36c385847f8a-catalog-content\") pod \"e80719a9-0b5a-418f-b038-36c385847f8a\" (UID: \"e80719a9-0b5a-418f-b038-36c385847f8a\") "
Oct 12 07:59:25 crc kubenswrapper[4599]: I1012 07:59:25.936207 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e80719a9-0b5a-418f-b038-36c385847f8a-utilities\") pod \"e80719a9-0b5a-418f-b038-36c385847f8a\" (UID: \"e80719a9-0b5a-418f-b038-36c385847f8a\") "
Oct 12 07:59:25 crc kubenswrapper[4599]: I1012 07:59:25.936844 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e80719a9-0b5a-418f-b038-36c385847f8a-utilities" (OuterVolumeSpecName: "utilities") pod "e80719a9-0b5a-418f-b038-36c385847f8a" (UID: "e80719a9-0b5a-418f-b038-36c385847f8a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 07:59:25 crc kubenswrapper[4599]: I1012 07:59:25.941763 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e80719a9-0b5a-418f-b038-36c385847f8a-kube-api-access-w8vnq" (OuterVolumeSpecName: "kube-api-access-w8vnq") pod "e80719a9-0b5a-418f-b038-36c385847f8a" (UID: "e80719a9-0b5a-418f-b038-36c385847f8a"). InnerVolumeSpecName "kube-api-access-w8vnq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 07:59:25 crc kubenswrapper[4599]: I1012 07:59:25.984142 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e80719a9-0b5a-418f-b038-36c385847f8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e80719a9-0b5a-418f-b038-36c385847f8a" (UID: "e80719a9-0b5a-418f-b038-36c385847f8a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 07:59:26 crc kubenswrapper[4599]: I1012 07:59:26.038254 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e80719a9-0b5a-418f-b038-36c385847f8a-utilities\") on node \"crc\" DevicePath \"\""
Oct 12 07:59:26 crc kubenswrapper[4599]: I1012 07:59:26.038398 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8vnq\" (UniqueName: \"kubernetes.io/projected/e80719a9-0b5a-418f-b038-36c385847f8a-kube-api-access-w8vnq\") on node \"crc\" DevicePath \"\""
Oct 12 07:59:26 crc kubenswrapper[4599]: I1012 07:59:26.038461 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e80719a9-0b5a-418f-b038-36c385847f8a-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 12 07:59:26 crc kubenswrapper[4599]: I1012 07:59:26.426630 4599 generic.go:334] "Generic (PLEG): container finished" podID="e80719a9-0b5a-418f-b038-36c385847f8a" containerID="0d1e01e071042d630ab26c984378d91b3a8e6b04b0e33cbedcdc02adfa74d792" exitCode=0
Oct 12 07:59:26 crc kubenswrapper[4599]: I1012 07:59:26.426683 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-66mxc"
Oct 12 07:59:26 crc kubenswrapper[4599]: I1012 07:59:26.426707 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-66mxc" event={"ID":"e80719a9-0b5a-418f-b038-36c385847f8a","Type":"ContainerDied","Data":"0d1e01e071042d630ab26c984378d91b3a8e6b04b0e33cbedcdc02adfa74d792"}
Oct 12 07:59:26 crc kubenswrapper[4599]: I1012 07:59:26.427084 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-66mxc" event={"ID":"e80719a9-0b5a-418f-b038-36c385847f8a","Type":"ContainerDied","Data":"496da70f0917c3a9607788a6f81dd3f2b6c78f8d90740903a4117149f49034d5"}
Oct 12 07:59:26 crc kubenswrapper[4599]: I1012 07:59:26.427108 4599 scope.go:117] "RemoveContainer" containerID="0d1e01e071042d630ab26c984378d91b3a8e6b04b0e33cbedcdc02adfa74d792"
Oct 12 07:59:26 crc kubenswrapper[4599]: I1012 07:59:26.447981 4599 scope.go:117] "RemoveContainer" containerID="27084f2c0586da03169458edf68a3a12dbe56aaf4938103f6d7b03cd04da2863"
Oct 12 07:59:26 crc kubenswrapper[4599]: I1012 07:59:26.463925 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-66mxc"]
Oct 12 07:59:26 crc kubenswrapper[4599]: I1012 07:59:26.471008 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-66mxc"]
Oct 12 07:59:26 crc kubenswrapper[4599]: I1012 07:59:26.481813 4599 scope.go:117] "RemoveContainer" containerID="79b1e5512da66f7a07bb9738ba399332a265104058c05ee2ce3b03d6eafddfaa"
Oct 12 07:59:26 crc kubenswrapper[4599]: I1012 07:59:26.510433 4599 scope.go:117] "RemoveContainer" containerID="0d1e01e071042d630ab26c984378d91b3a8e6b04b0e33cbedcdc02adfa74d792"
Oct 12 07:59:26 crc kubenswrapper[4599]: E1012 07:59:26.510984 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d1e01e071042d630ab26c984378d91b3a8e6b04b0e33cbedcdc02adfa74d792\": container with ID starting with 0d1e01e071042d630ab26c984378d91b3a8e6b04b0e33cbedcdc02adfa74d792 not found: ID does not exist" containerID="0d1e01e071042d630ab26c984378d91b3a8e6b04b0e33cbedcdc02adfa74d792"
Oct 12 07:59:26 crc kubenswrapper[4599]: I1012 07:59:26.511066 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d1e01e071042d630ab26c984378d91b3a8e6b04b0e33cbedcdc02adfa74d792"} err="failed to get container status \"0d1e01e071042d630ab26c984378d91b3a8e6b04b0e33cbedcdc02adfa74d792\": rpc error: code = NotFound desc = could not find container \"0d1e01e071042d630ab26c984378d91b3a8e6b04b0e33cbedcdc02adfa74d792\": container with ID starting with 0d1e01e071042d630ab26c984378d91b3a8e6b04b0e33cbedcdc02adfa74d792 not found: ID does not exist"
Oct 12 07:59:26 crc kubenswrapper[4599]: I1012 07:59:26.511116 4599 scope.go:117] "RemoveContainer" containerID="27084f2c0586da03169458edf68a3a12dbe56aaf4938103f6d7b03cd04da2863"
Oct 12 07:59:26 crc kubenswrapper[4599]: E1012 07:59:26.511588 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27084f2c0586da03169458edf68a3a12dbe56aaf4938103f6d7b03cd04da2863\": container with ID starting with 27084f2c0586da03169458edf68a3a12dbe56aaf4938103f6d7b03cd04da2863 not found: ID does not exist" containerID="27084f2c0586da03169458edf68a3a12dbe56aaf4938103f6d7b03cd04da2863"
Oct 12 07:59:26 crc kubenswrapper[4599]: I1012 07:59:26.511659 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27084f2c0586da03169458edf68a3a12dbe56aaf4938103f6d7b03cd04da2863"} err="failed to get container status \"27084f2c0586da03169458edf68a3a12dbe56aaf4938103f6d7b03cd04da2863\": rpc error: code = NotFound desc = could not find container \"27084f2c0586da03169458edf68a3a12dbe56aaf4938103f6d7b03cd04da2863\": container with ID starting with 27084f2c0586da03169458edf68a3a12dbe56aaf4938103f6d7b03cd04da2863 not found: ID does not exist"
Oct 12 07:59:26 crc kubenswrapper[4599]: I1012 07:59:26.511700 4599 scope.go:117] "RemoveContainer" containerID="79b1e5512da66f7a07bb9738ba399332a265104058c05ee2ce3b03d6eafddfaa"
Oct 12 07:59:26 crc kubenswrapper[4599]: E1012 07:59:26.512043 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79b1e5512da66f7a07bb9738ba399332a265104058c05ee2ce3b03d6eafddfaa\": container with ID starting with 79b1e5512da66f7a07bb9738ba399332a265104058c05ee2ce3b03d6eafddfaa not found: ID does not exist" containerID="79b1e5512da66f7a07bb9738ba399332a265104058c05ee2ce3b03d6eafddfaa"
Oct 12 07:59:26 crc kubenswrapper[4599]: I1012 07:59:26.512081 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b1e5512da66f7a07bb9738ba399332a265104058c05ee2ce3b03d6eafddfaa"} err="failed to get container status \"79b1e5512da66f7a07bb9738ba399332a265104058c05ee2ce3b03d6eafddfaa\": rpc error: code = NotFound desc = could not find container \"79b1e5512da66f7a07bb9738ba399332a265104058c05ee2ce3b03d6eafddfaa\": container with ID starting with 79b1e5512da66f7a07bb9738ba399332a265104058c05ee2ce3b03d6eafddfaa not found: ID does not exist"
Oct 12 07:59:27 crc kubenswrapper[4599]: I1012 07:59:27.555660 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e80719a9-0b5a-418f-b038-36c385847f8a" path="/var/lib/kubelet/pods/e80719a9-0b5a-418f-b038-36c385847f8a/volumes"
Oct 12 07:59:28 crc kubenswrapper[4599]: I1012 07:59:28.321755 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 12 07:59:28 crc kubenswrapper[4599]: I1012 07:59:28.321826 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 12 07:59:28 crc kubenswrapper[4599]: I1012 07:59:28.321884 4599 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c"
Oct 12 07:59:28 crc kubenswrapper[4599]: I1012 07:59:28.322446 4599 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968"} pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 12 07:59:28 crc kubenswrapper[4599]: I1012 07:59:28.322512 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" containerID="cri-o://f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" gracePeriod=600
Oct 12 07:59:28 crc kubenswrapper[4599]: E1012 07:59:28.440511 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e"
Oct 12 07:59:28 crc kubenswrapper[4599]: I1012 07:59:28.449732 4599 generic.go:334] "Generic (PLEG): container finished" podID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" exitCode=0
Oct 12 07:59:28 crc kubenswrapper[4599]: I1012 07:59:28.449780 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerDied","Data":"f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968"}
Oct 12 07:59:28 crc kubenswrapper[4599]: I1012 07:59:28.449843 4599 scope.go:117] "RemoveContainer" containerID="55f9a891f5b2d1aee66eb372fd29e5ae6e5d430252e3a9f1e9416dab9bddaf9f"
Oct 12 07:59:28 crc kubenswrapper[4599]: I1012 07:59:28.450565 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968"
Oct 12 07:59:28 crc kubenswrapper[4599]: E1012 07:59:28.450832 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e"
Oct 12 07:59:35 crc kubenswrapper[4599]: I1012 07:59:35.523368 4599 generic.go:334] "Generic (PLEG): container finished" podID="9988d105-d7f0-459a-a8d9-056ac0d3abab" containerID="5178afb38c7b1209f5b4b5f4b60a4f548de1af8f4cd55050225ef3bf19c0f6cc" exitCode=0
Oct 12 07:59:35 crc kubenswrapper[4599]: I1012 07:59:35.523405 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx" event={"ID":"9988d105-d7f0-459a-a8d9-056ac0d3abab","Type":"ContainerDied","Data":"5178afb38c7b1209f5b4b5f4b60a4f548de1af8f4cd55050225ef3bf19c0f6cc"}
Oct 12 07:59:36 crc kubenswrapper[4599]: I1012 07:59:36.871703 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.027167 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-h6mzn"]
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.035186 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-h6mzn"]
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.049544 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9988d105-d7f0-459a-a8d9-056ac0d3abab-inventory\") pod \"9988d105-d7f0-459a-a8d9-056ac0d3abab\" (UID: \"9988d105-d7f0-459a-a8d9-056ac0d3abab\") "
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.049729 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49pqb\" (UniqueName: \"kubernetes.io/projected/9988d105-d7f0-459a-a8d9-056ac0d3abab-kube-api-access-49pqb\") pod \"9988d105-d7f0-459a-a8d9-056ac0d3abab\" (UID: \"9988d105-d7f0-459a-a8d9-056ac0d3abab\") "
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.049817 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9988d105-d7f0-459a-a8d9-056ac0d3abab-ssh-key\") pod \"9988d105-d7f0-459a-a8d9-056ac0d3abab\" (UID: \"9988d105-d7f0-459a-a8d9-056ac0d3abab\") "
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.055675 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9988d105-d7f0-459a-a8d9-056ac0d3abab-kube-api-access-49pqb" (OuterVolumeSpecName: "kube-api-access-49pqb") pod "9988d105-d7f0-459a-a8d9-056ac0d3abab" (UID: "9988d105-d7f0-459a-a8d9-056ac0d3abab"). InnerVolumeSpecName "kube-api-access-49pqb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 07:59:37 crc kubenswrapper[4599]: E1012 07:59:37.071999 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9988d105-d7f0-459a-a8d9-056ac0d3abab-ssh-key podName:9988d105-d7f0-459a-a8d9-056ac0d3abab nodeName:}" failed. No retries permitted until 2025-10-12 07:59:37.571971521 +0000 UTC m=+1474.361167023 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key" (UniqueName: "kubernetes.io/secret/9988d105-d7f0-459a-a8d9-056ac0d3abab-ssh-key") pod "9988d105-d7f0-459a-a8d9-056ac0d3abab" (UID: "9988d105-d7f0-459a-a8d9-056ac0d3abab") : error deleting /var/lib/kubelet/pods/9988d105-d7f0-459a-a8d9-056ac0d3abab/volume-subpaths: remove /var/lib/kubelet/pods/9988d105-d7f0-459a-a8d9-056ac0d3abab/volume-subpaths: no such file or directory
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.074635 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9988d105-d7f0-459a-a8d9-056ac0d3abab-inventory" (OuterVolumeSpecName: "inventory") pod "9988d105-d7f0-459a-a8d9-056ac0d3abab" (UID: "9988d105-d7f0-459a-a8d9-056ac0d3abab"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.151292 4599 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9988d105-d7f0-459a-a8d9-056ac0d3abab-inventory\") on node \"crc\" DevicePath \"\""
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.151317 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49pqb\" (UniqueName: \"kubernetes.io/projected/9988d105-d7f0-459a-a8d9-056ac0d3abab-kube-api-access-49pqb\") on node \"crc\" DevicePath \"\""
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.540726 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx" event={"ID":"9988d105-d7f0-459a-a8d9-056ac0d3abab","Type":"ContainerDied","Data":"ed16fd8b3921eec2015fac1898b5d5dac6ee39b4a9967bfc06ada806131b62ad"}
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.540790 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed16fd8b3921eec2015fac1898b5d5dac6ee39b4a9967bfc06ada806131b62ad"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.540786 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.553128 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8468e667-fc53-4e39-939f-2221c78d4313" path="/var/lib/kubelet/pods/8468e667-fc53-4e39-939f-2221c78d4313/volumes"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.608984 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p"]
Oct 12 07:59:37 crc kubenswrapper[4599]: E1012 07:59:37.609448 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e80719a9-0b5a-418f-b038-36c385847f8a" containerName="registry-server"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.609468 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="e80719a9-0b5a-418f-b038-36c385847f8a" containerName="registry-server"
Oct 12 07:59:37 crc kubenswrapper[4599]: E1012 07:59:37.609483 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e80719a9-0b5a-418f-b038-36c385847f8a" containerName="extract-utilities"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.609489 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="e80719a9-0b5a-418f-b038-36c385847f8a" containerName="extract-utilities"
Oct 12 07:59:37 crc kubenswrapper[4599]: E1012 07:59:37.609526 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9988d105-d7f0-459a-a8d9-056ac0d3abab" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.609534 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="9988d105-d7f0-459a-a8d9-056ac0d3abab" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Oct 12 07:59:37 crc kubenswrapper[4599]: E1012 07:59:37.609541 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e80719a9-0b5a-418f-b038-36c385847f8a" containerName="extract-content"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.609547 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="e80719a9-0b5a-418f-b038-36c385847f8a" containerName="extract-content"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.609769 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="e80719a9-0b5a-418f-b038-36c385847f8a" containerName="registry-server"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.609786 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="9988d105-d7f0-459a-a8d9-056ac0d3abab" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.610490 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.622578 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p"]
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.662104 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9988d105-d7f0-459a-a8d9-056ac0d3abab-ssh-key\") pod \"9988d105-d7f0-459a-a8d9-056ac0d3abab\" (UID: \"9988d105-d7f0-459a-a8d9-056ac0d3abab\") "
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.662982 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkbj8\" (UniqueName: \"kubernetes.io/projected/0027b20a-21c6-437b-b807-50484ab21289-kube-api-access-xkbj8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-d787p\" (UID: \"0027b20a-21c6-437b-b807-50484ab21289\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.663215 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0027b20a-21c6-437b-b807-50484ab21289-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-d787p\" (UID: \"0027b20a-21c6-437b-b807-50484ab21289\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.663298 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0027b20a-21c6-437b-b807-50484ab21289-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-d787p\" (UID: \"0027b20a-21c6-437b-b807-50484ab21289\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.666222 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9988d105-d7f0-459a-a8d9-056ac0d3abab-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9988d105-d7f0-459a-a8d9-056ac0d3abab" (UID: "9988d105-d7f0-459a-a8d9-056ac0d3abab"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.765501 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkbj8\" (UniqueName: \"kubernetes.io/projected/0027b20a-21c6-437b-b807-50484ab21289-kube-api-access-xkbj8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-d787p\" (UID: \"0027b20a-21c6-437b-b807-50484ab21289\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.765747 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0027b20a-21c6-437b-b807-50484ab21289-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-d787p\" (UID: \"0027b20a-21c6-437b-b807-50484ab21289\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.765840 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0027b20a-21c6-437b-b807-50484ab21289-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-d787p\" (UID: \"0027b20a-21c6-437b-b807-50484ab21289\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.765920 4599 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9988d105-d7f0-459a-a8d9-056ac0d3abab-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.769955 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0027b20a-21c6-437b-b807-50484ab21289-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-d787p\" (UID: \"0027b20a-21c6-437b-b807-50484ab21289\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.771791 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0027b20a-21c6-437b-b807-50484ab21289-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-d787p\" (UID: \"0027b20a-21c6-437b-b807-50484ab21289\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.780998 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkbj8\" (UniqueName: \"kubernetes.io/projected/0027b20a-21c6-437b-b807-50484ab21289-kube-api-access-xkbj8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-d787p\" (UID: \"0027b20a-21c6-437b-b807-50484ab21289\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p"
Oct 12 07:59:37 crc kubenswrapper[4599]: I1012 07:59:37.935009 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p"
Oct 12 07:59:38 crc kubenswrapper[4599]: I1012 07:59:38.401992 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p"]
Oct 12 07:59:38 crc kubenswrapper[4599]: I1012 07:59:38.550667 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p" event={"ID":"0027b20a-21c6-437b-b807-50484ab21289","Type":"ContainerStarted","Data":"8c724e1be45de38b2e5ca914e0025a4fa446004d3efecb4842678ab4f207b38e"}
Oct 12 07:59:39 crc kubenswrapper[4599]: I1012 07:59:39.561141 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p" event={"ID":"0027b20a-21c6-437b-b807-50484ab21289","Type":"ContainerStarted","Data":"319432df22b4a42f6252727af14102a8ca8c6ecc42f7bda8e60d22924a09010c"}
Oct 12 07:59:39 crc kubenswrapper[4599]: I1012 07:59:39.589560 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p" podStartSLOduration=2.075431302 podStartE2EDuration="2.589544347s" podCreationTimestamp="2025-10-12 07:59:37 +0000 UTC" firstStartedPulling="2025-10-12 07:59:38.407215061 +0000 UTC m=+1475.196410563" lastFinishedPulling="2025-10-12 07:59:38.921328106 +0000 UTC m=+1475.710523608" observedRunningTime="2025-10-12 07:59:39.578937054 +0000 UTC m=+1476.368132557" watchObservedRunningTime="2025-10-12 07:59:39.589544347 +0000 UTC m=+1476.378739849"
Oct 12 07:59:42 crc kubenswrapper[4599]: I1012 07:59:42.585421 4599 generic.go:334] "Generic (PLEG): container finished" podID="0027b20a-21c6-437b-b807-50484ab21289" containerID="319432df22b4a42f6252727af14102a8ca8c6ecc42f7bda8e60d22924a09010c" exitCode=0
Oct 12 07:59:42 crc kubenswrapper[4599]: I1012 07:59:42.585503 4599 kubelet.go:2453] "SyncLoop (PLEG):
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p" event={"ID":"0027b20a-21c6-437b-b807-50484ab21289","Type":"ContainerDied","Data":"319432df22b4a42f6252727af14102a8ca8c6ecc42f7bda8e60d22924a09010c"} Oct 12 07:59:43 crc kubenswrapper[4599]: I1012 07:59:43.551479 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 07:59:43 crc kubenswrapper[4599]: E1012 07:59:43.552046 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 07:59:43 crc kubenswrapper[4599]: I1012 07:59:43.906813 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.024553 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-x47gg"] Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.029924 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-x47gg"] Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.083873 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkbj8\" (UniqueName: \"kubernetes.io/projected/0027b20a-21c6-437b-b807-50484ab21289-kube-api-access-xkbj8\") pod \"0027b20a-21c6-437b-b807-50484ab21289\" (UID: \"0027b20a-21c6-437b-b807-50484ab21289\") " Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.083938 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0027b20a-21c6-437b-b807-50484ab21289-inventory\") pod \"0027b20a-21c6-437b-b807-50484ab21289\" (UID: \"0027b20a-21c6-437b-b807-50484ab21289\") " Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.084137 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0027b20a-21c6-437b-b807-50484ab21289-ssh-key\") pod \"0027b20a-21c6-437b-b807-50484ab21289\" (UID: \"0027b20a-21c6-437b-b807-50484ab21289\") " Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.089627 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0027b20a-21c6-437b-b807-50484ab21289-kube-api-access-xkbj8" (OuterVolumeSpecName: "kube-api-access-xkbj8") pod "0027b20a-21c6-437b-b807-50484ab21289" (UID: "0027b20a-21c6-437b-b807-50484ab21289"). InnerVolumeSpecName "kube-api-access-xkbj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:59:44 crc kubenswrapper[4599]: E1012 07:59:44.105313 4599 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0027b20a-21c6-437b-b807-50484ab21289-ssh-key podName:0027b20a-21c6-437b-b807-50484ab21289 nodeName:}" failed. No retries permitted until 2025-10-12 07:59:44.605284317 +0000 UTC m=+1481.394479829 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key" (UniqueName: "kubernetes.io/secret/0027b20a-21c6-437b-b807-50484ab21289-ssh-key") pod "0027b20a-21c6-437b-b807-50484ab21289" (UID: "0027b20a-21c6-437b-b807-50484ab21289") : error deleting /var/lib/kubelet/pods/0027b20a-21c6-437b-b807-50484ab21289/volume-subpaths: remove /var/lib/kubelet/pods/0027b20a-21c6-437b-b807-50484ab21289/volume-subpaths: no such file or directory Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.108037 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0027b20a-21c6-437b-b807-50484ab21289-inventory" (OuterVolumeSpecName: "inventory") pod "0027b20a-21c6-437b-b807-50484ab21289" (UID: "0027b20a-21c6-437b-b807-50484ab21289"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.187103 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkbj8\" (UniqueName: \"kubernetes.io/projected/0027b20a-21c6-437b-b807-50484ab21289-kube-api-access-xkbj8\") on node \"crc\" DevicePath \"\"" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.187150 4599 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0027b20a-21c6-437b-b807-50484ab21289-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.601579 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p" event={"ID":"0027b20a-21c6-437b-b807-50484ab21289","Type":"ContainerDied","Data":"8c724e1be45de38b2e5ca914e0025a4fa446004d3efecb4842678ab4f207b38e"} Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.601626 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c724e1be45de38b2e5ca914e0025a4fa446004d3efecb4842678ab4f207b38e" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.601642 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-d787p" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.651963 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq"] Oct 12 07:59:44 crc kubenswrapper[4599]: E1012 07:59:44.652441 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0027b20a-21c6-437b-b807-50484ab21289" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.652463 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="0027b20a-21c6-437b-b807-50484ab21289" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.652676 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="0027b20a-21c6-437b-b807-50484ab21289" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.653327 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.667266 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq"] Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.693886 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0027b20a-21c6-437b-b807-50484ab21289-ssh-key\") pod \"0027b20a-21c6-437b-b807-50484ab21289\" (UID: \"0027b20a-21c6-437b-b807-50484ab21289\") " Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.697945 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0027b20a-21c6-437b-b807-50484ab21289-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0027b20a-21c6-437b-b807-50484ab21289" (UID: "0027b20a-21c6-437b-b807-50484ab21289"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.796581 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvqdl\" (UniqueName: \"kubernetes.io/projected/e862c4d5-24d0-42ee-82f5-7a17fc6773aa-kube-api-access-kvqdl\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6ngmq\" (UID: \"e862c4d5-24d0-42ee-82f5-7a17fc6773aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.796662 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e862c4d5-24d0-42ee-82f5-7a17fc6773aa-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6ngmq\" (UID: \"e862c4d5-24d0-42ee-82f5-7a17fc6773aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 
07:59:44.796780 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e862c4d5-24d0-42ee-82f5-7a17fc6773aa-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6ngmq\" (UID: \"e862c4d5-24d0-42ee-82f5-7a17fc6773aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.796904 4599 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0027b20a-21c6-437b-b807-50484ab21289-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.898690 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e862c4d5-24d0-42ee-82f5-7a17fc6773aa-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6ngmq\" (UID: \"e862c4d5-24d0-42ee-82f5-7a17fc6773aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.898789 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvqdl\" (UniqueName: \"kubernetes.io/projected/e862c4d5-24d0-42ee-82f5-7a17fc6773aa-kube-api-access-kvqdl\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6ngmq\" (UID: \"e862c4d5-24d0-42ee-82f5-7a17fc6773aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.898828 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e862c4d5-24d0-42ee-82f5-7a17fc6773aa-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6ngmq\" (UID: \"e862c4d5-24d0-42ee-82f5-7a17fc6773aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 
07:59:44.903086 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e862c4d5-24d0-42ee-82f5-7a17fc6773aa-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6ngmq\" (UID: \"e862c4d5-24d0-42ee-82f5-7a17fc6773aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.903896 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e862c4d5-24d0-42ee-82f5-7a17fc6773aa-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6ngmq\" (UID: \"e862c4d5-24d0-42ee-82f5-7a17fc6773aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.913155 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvqdl\" (UniqueName: \"kubernetes.io/projected/e862c4d5-24d0-42ee-82f5-7a17fc6773aa-kube-api-access-kvqdl\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6ngmq\" (UID: \"e862c4d5-24d0-42ee-82f5-7a17fc6773aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq" Oct 12 07:59:44 crc kubenswrapper[4599]: I1012 07:59:44.970113 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq" Oct 12 07:59:45 crc kubenswrapper[4599]: I1012 07:59:45.426007 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq"] Oct 12 07:59:45 crc kubenswrapper[4599]: I1012 07:59:45.554497 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc0cde93-5cbb-4652-9dc3-05666e908b49" path="/var/lib/kubelet/pods/dc0cde93-5cbb-4652-9dc3-05666e908b49/volumes" Oct 12 07:59:45 crc kubenswrapper[4599]: I1012 07:59:45.609480 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq" event={"ID":"e862c4d5-24d0-42ee-82f5-7a17fc6773aa","Type":"ContainerStarted","Data":"44ebe415eb5c9019cd68ccd20eb769e202bf776a38400501b97f85411124a7dc"} Oct 12 07:59:46 crc kubenswrapper[4599]: I1012 07:59:46.618694 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq" event={"ID":"e862c4d5-24d0-42ee-82f5-7a17fc6773aa","Type":"ContainerStarted","Data":"2546c2bfef705e2acbd31933e306633d6df2f6567a3e192f347badc4b730fbf0"} Oct 12 07:59:46 crc kubenswrapper[4599]: I1012 07:59:46.639095 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq" podStartSLOduration=2.012299562 podStartE2EDuration="2.639075571s" podCreationTimestamp="2025-10-12 07:59:44 +0000 UTC" firstStartedPulling="2025-10-12 07:59:45.432448013 +0000 UTC m=+1482.221643515" lastFinishedPulling="2025-10-12 07:59:46.059224022 +0000 UTC m=+1482.848419524" observedRunningTime="2025-10-12 07:59:46.634098201 +0000 UTC m=+1483.423293702" watchObservedRunningTime="2025-10-12 07:59:46.639075571 +0000 UTC m=+1483.428271073" Oct 12 07:59:47 crc kubenswrapper[4599]: I1012 07:59:47.025254 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-db-sync-jjn25"] Oct 12 07:59:47 crc kubenswrapper[4599]: I1012 07:59:47.032624 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-jjn25"] Oct 12 07:59:47 crc kubenswrapper[4599]: I1012 07:59:47.554077 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd14916d-8d1c-4c3d-9721-2d5e0e171db1" path="/var/lib/kubelet/pods/dd14916d-8d1c-4c3d-9721-2d5e0e171db1/volumes" Oct 12 07:59:57 crc kubenswrapper[4599]: I1012 07:59:57.545353 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 07:59:57 crc kubenswrapper[4599]: E1012 07:59:57.546081 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:00:00 crc kubenswrapper[4599]: I1012 08:00:00.136577 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337600-2kgj6"] Oct 12 08:00:00 crc kubenswrapper[4599]: I1012 08:00:00.138099 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337600-2kgj6" Oct 12 08:00:00 crc kubenswrapper[4599]: I1012 08:00:00.139751 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 12 08:00:00 crc kubenswrapper[4599]: I1012 08:00:00.140277 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 12 08:00:00 crc kubenswrapper[4599]: I1012 08:00:00.150101 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337600-2kgj6"] Oct 12 08:00:00 crc kubenswrapper[4599]: I1012 08:00:00.178382 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a4bb464-44b3-460f-b9d1-92872095c4cf-secret-volume\") pod \"collect-profiles-29337600-2kgj6\" (UID: \"7a4bb464-44b3-460f-b9d1-92872095c4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337600-2kgj6" Oct 12 08:00:00 crc kubenswrapper[4599]: I1012 08:00:00.178510 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvjst\" (UniqueName: \"kubernetes.io/projected/7a4bb464-44b3-460f-b9d1-92872095c4cf-kube-api-access-gvjst\") pod \"collect-profiles-29337600-2kgj6\" (UID: \"7a4bb464-44b3-460f-b9d1-92872095c4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337600-2kgj6" Oct 12 08:00:00 crc kubenswrapper[4599]: I1012 08:00:00.178601 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a4bb464-44b3-460f-b9d1-92872095c4cf-config-volume\") pod \"collect-profiles-29337600-2kgj6\" (UID: \"7a4bb464-44b3-460f-b9d1-92872095c4cf\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29337600-2kgj6" Oct 12 08:00:00 crc kubenswrapper[4599]: I1012 08:00:00.280104 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a4bb464-44b3-460f-b9d1-92872095c4cf-secret-volume\") pod \"collect-profiles-29337600-2kgj6\" (UID: \"7a4bb464-44b3-460f-b9d1-92872095c4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337600-2kgj6" Oct 12 08:00:00 crc kubenswrapper[4599]: I1012 08:00:00.280254 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvjst\" (UniqueName: \"kubernetes.io/projected/7a4bb464-44b3-460f-b9d1-92872095c4cf-kube-api-access-gvjst\") pod \"collect-profiles-29337600-2kgj6\" (UID: \"7a4bb464-44b3-460f-b9d1-92872095c4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337600-2kgj6" Oct 12 08:00:00 crc kubenswrapper[4599]: I1012 08:00:00.280361 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a4bb464-44b3-460f-b9d1-92872095c4cf-config-volume\") pod \"collect-profiles-29337600-2kgj6\" (UID: \"7a4bb464-44b3-460f-b9d1-92872095c4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337600-2kgj6" Oct 12 08:00:00 crc kubenswrapper[4599]: I1012 08:00:00.281233 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a4bb464-44b3-460f-b9d1-92872095c4cf-config-volume\") pod \"collect-profiles-29337600-2kgj6\" (UID: \"7a4bb464-44b3-460f-b9d1-92872095c4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337600-2kgj6" Oct 12 08:00:00 crc kubenswrapper[4599]: I1012 08:00:00.287782 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7a4bb464-44b3-460f-b9d1-92872095c4cf-secret-volume\") pod \"collect-profiles-29337600-2kgj6\" (UID: \"7a4bb464-44b3-460f-b9d1-92872095c4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337600-2kgj6" Oct 12 08:00:00 crc kubenswrapper[4599]: I1012 08:00:00.294800 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvjst\" (UniqueName: \"kubernetes.io/projected/7a4bb464-44b3-460f-b9d1-92872095c4cf-kube-api-access-gvjst\") pod \"collect-profiles-29337600-2kgj6\" (UID: \"7a4bb464-44b3-460f-b9d1-92872095c4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337600-2kgj6" Oct 12 08:00:00 crc kubenswrapper[4599]: I1012 08:00:00.453493 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337600-2kgj6" Oct 12 08:00:00 crc kubenswrapper[4599]: I1012 08:00:00.837521 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337600-2kgj6"] Oct 12 08:00:01 crc kubenswrapper[4599]: I1012 08:00:01.726370 4599 generic.go:334] "Generic (PLEG): container finished" podID="7a4bb464-44b3-460f-b9d1-92872095c4cf" containerID="1936059035e439e98dc7d4a303245b7de493b624461ee21339c7b5edf1775c01" exitCode=0 Oct 12 08:00:01 crc kubenswrapper[4599]: I1012 08:00:01.726469 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337600-2kgj6" event={"ID":"7a4bb464-44b3-460f-b9d1-92872095c4cf","Type":"ContainerDied","Data":"1936059035e439e98dc7d4a303245b7de493b624461ee21339c7b5edf1775c01"} Oct 12 08:00:01 crc kubenswrapper[4599]: I1012 08:00:01.726710 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337600-2kgj6" 
event={"ID":"7a4bb464-44b3-460f-b9d1-92872095c4cf","Type":"ContainerStarted","Data":"085363c6b4b77b46a53e31c0be2975aadc69f37ccc1ddfca470af1c0a7f9fa06"} Oct 12 08:00:03 crc kubenswrapper[4599]: I1012 08:00:03.003771 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337600-2kgj6" Oct 12 08:00:03 crc kubenswrapper[4599]: I1012 08:00:03.027765 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a4bb464-44b3-460f-b9d1-92872095c4cf-config-volume\") pod \"7a4bb464-44b3-460f-b9d1-92872095c4cf\" (UID: \"7a4bb464-44b3-460f-b9d1-92872095c4cf\") " Oct 12 08:00:03 crc kubenswrapper[4599]: I1012 08:00:03.027802 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a4bb464-44b3-460f-b9d1-92872095c4cf-secret-volume\") pod \"7a4bb464-44b3-460f-b9d1-92872095c4cf\" (UID: \"7a4bb464-44b3-460f-b9d1-92872095c4cf\") " Oct 12 08:00:03 crc kubenswrapper[4599]: I1012 08:00:03.027839 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvjst\" (UniqueName: \"kubernetes.io/projected/7a4bb464-44b3-460f-b9d1-92872095c4cf-kube-api-access-gvjst\") pod \"7a4bb464-44b3-460f-b9d1-92872095c4cf\" (UID: \"7a4bb464-44b3-460f-b9d1-92872095c4cf\") " Oct 12 08:00:03 crc kubenswrapper[4599]: I1012 08:00:03.028357 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a4bb464-44b3-460f-b9d1-92872095c4cf-config-volume" (OuterVolumeSpecName: "config-volume") pod "7a4bb464-44b3-460f-b9d1-92872095c4cf" (UID: "7a4bb464-44b3-460f-b9d1-92872095c4cf"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 08:00:03 crc kubenswrapper[4599]: I1012 08:00:03.032704 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a4bb464-44b3-460f-b9d1-92872095c4cf-kube-api-access-gvjst" (OuterVolumeSpecName: "kube-api-access-gvjst") pod "7a4bb464-44b3-460f-b9d1-92872095c4cf" (UID: "7a4bb464-44b3-460f-b9d1-92872095c4cf"). InnerVolumeSpecName "kube-api-access-gvjst". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:00:03 crc kubenswrapper[4599]: I1012 08:00:03.032926 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a4bb464-44b3-460f-b9d1-92872095c4cf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7a4bb464-44b3-460f-b9d1-92872095c4cf" (UID: "7a4bb464-44b3-460f-b9d1-92872095c4cf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:00:03 crc kubenswrapper[4599]: I1012 08:00:03.130206 4599 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a4bb464-44b3-460f-b9d1-92872095c4cf-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 08:00:03 crc kubenswrapper[4599]: I1012 08:00:03.130245 4599 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a4bb464-44b3-460f-b9d1-92872095c4cf-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 12 08:00:03 crc kubenswrapper[4599]: I1012 08:00:03.130258 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvjst\" (UniqueName: \"kubernetes.io/projected/7a4bb464-44b3-460f-b9d1-92872095c4cf-kube-api-access-gvjst\") on node \"crc\" DevicePath \"\"" Oct 12 08:00:03 crc kubenswrapper[4599]: I1012 08:00:03.743491 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337600-2kgj6" 
event={"ID":"7a4bb464-44b3-460f-b9d1-92872095c4cf","Type":"ContainerDied","Data":"085363c6b4b77b46a53e31c0be2975aadc69f37ccc1ddfca470af1c0a7f9fa06"} Oct 12 08:00:03 crc kubenswrapper[4599]: I1012 08:00:03.743764 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="085363c6b4b77b46a53e31c0be2975aadc69f37ccc1ddfca470af1c0a7f9fa06" Oct 12 08:00:03 crc kubenswrapper[4599]: I1012 08:00:03.743599 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337600-2kgj6" Oct 12 08:00:06 crc kubenswrapper[4599]: I1012 08:00:06.664196 4599 scope.go:117] "RemoveContainer" containerID="e99684c0cc74ae635f1895693c49738c2353540617db6b5aadc701551a93a579" Oct 12 08:00:06 crc kubenswrapper[4599]: I1012 08:00:06.696222 4599 scope.go:117] "RemoveContainer" containerID="dec604968bf1207c013ff70bfd46a24b591d3d267a5174aeffe1f5dd2b74886c" Oct 12 08:00:06 crc kubenswrapper[4599]: I1012 08:00:06.737290 4599 scope.go:117] "RemoveContainer" containerID="5a4ad9a8d063c11b64ee6e2b27292cec66ec28f8bd6209138160a1fec33b2c1d" Oct 12 08:00:06 crc kubenswrapper[4599]: I1012 08:00:06.774703 4599 scope.go:117] "RemoveContainer" containerID="47a28c2366261edb36aec145141f5c0e6ab03e13b3b034083d35349f45c1deeb" Oct 12 08:00:06 crc kubenswrapper[4599]: I1012 08:00:06.799310 4599 scope.go:117] "RemoveContainer" containerID="69696fa14b4229b999555a09f3c99d56d1ed2999a72dc2386a779e21cb70446a" Oct 12 08:00:06 crc kubenswrapper[4599]: I1012 08:00:06.828617 4599 scope.go:117] "RemoveContainer" containerID="cd02781890a38d192ec7d89fe075f2aefaef75bbc5f9779efa162dc3bacb067b" Oct 12 08:00:06 crc kubenswrapper[4599]: I1012 08:00:06.863499 4599 scope.go:117] "RemoveContainer" containerID="af771c5b9656ff27ab51a4ad7330e600b927200350e0d20ebade0d7bcffd2bc7" Oct 12 08:00:06 crc kubenswrapper[4599]: I1012 08:00:06.877597 4599 scope.go:117] "RemoveContainer" 
containerID="0ae8e7b7cbd7d13a1e8b77c33d83159eb9cb77ba4482cdac6c60a1cc21408e05" Oct 12 08:00:09 crc kubenswrapper[4599]: I1012 08:00:09.545722 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:00:09 crc kubenswrapper[4599]: E1012 08:00:09.546755 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:00:11 crc kubenswrapper[4599]: I1012 08:00:11.819943 4599 generic.go:334] "Generic (PLEG): container finished" podID="e862c4d5-24d0-42ee-82f5-7a17fc6773aa" containerID="2546c2bfef705e2acbd31933e306633d6df2f6567a3e192f347badc4b730fbf0" exitCode=0 Oct 12 08:00:11 crc kubenswrapper[4599]: I1012 08:00:11.820030 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq" event={"ID":"e862c4d5-24d0-42ee-82f5-7a17fc6773aa","Type":"ContainerDied","Data":"2546c2bfef705e2acbd31933e306633d6df2f6567a3e192f347badc4b730fbf0"} Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.141066 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.288899 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e862c4d5-24d0-42ee-82f5-7a17fc6773aa-inventory\") pod \"e862c4d5-24d0-42ee-82f5-7a17fc6773aa\" (UID: \"e862c4d5-24d0-42ee-82f5-7a17fc6773aa\") " Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.288974 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e862c4d5-24d0-42ee-82f5-7a17fc6773aa-ssh-key\") pod \"e862c4d5-24d0-42ee-82f5-7a17fc6773aa\" (UID: \"e862c4d5-24d0-42ee-82f5-7a17fc6773aa\") " Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.289124 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvqdl\" (UniqueName: \"kubernetes.io/projected/e862c4d5-24d0-42ee-82f5-7a17fc6773aa-kube-api-access-kvqdl\") pod \"e862c4d5-24d0-42ee-82f5-7a17fc6773aa\" (UID: \"e862c4d5-24d0-42ee-82f5-7a17fc6773aa\") " Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.293938 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e862c4d5-24d0-42ee-82f5-7a17fc6773aa-kube-api-access-kvqdl" (OuterVolumeSpecName: "kube-api-access-kvqdl") pod "e862c4d5-24d0-42ee-82f5-7a17fc6773aa" (UID: "e862c4d5-24d0-42ee-82f5-7a17fc6773aa"). InnerVolumeSpecName "kube-api-access-kvqdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.311204 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e862c4d5-24d0-42ee-82f5-7a17fc6773aa-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e862c4d5-24d0-42ee-82f5-7a17fc6773aa" (UID: "e862c4d5-24d0-42ee-82f5-7a17fc6773aa"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.312661 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e862c4d5-24d0-42ee-82f5-7a17fc6773aa-inventory" (OuterVolumeSpecName: "inventory") pod "e862c4d5-24d0-42ee-82f5-7a17fc6773aa" (UID: "e862c4d5-24d0-42ee-82f5-7a17fc6773aa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.391571 4599 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e862c4d5-24d0-42ee-82f5-7a17fc6773aa-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.391778 4599 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e862c4d5-24d0-42ee-82f5-7a17fc6773aa-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.391788 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvqdl\" (UniqueName: \"kubernetes.io/projected/e862c4d5-24d0-42ee-82f5-7a17fc6773aa-kube-api-access-kvqdl\") on node \"crc\" DevicePath \"\"" Oct 12 08:00:13 crc kubenswrapper[4599]: E1012 08:00:13.753962 4599 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a4bb464_44b3_460f_b9d1_92872095c4cf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a4bb464_44b3_460f_b9d1_92872095c4cf.slice/crio-085363c6b4b77b46a53e31c0be2975aadc69f37ccc1ddfca470af1c0a7f9fa06\": RecentStats: unable to find data in memory cache]" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.835642 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq" event={"ID":"e862c4d5-24d0-42ee-82f5-7a17fc6773aa","Type":"ContainerDied","Data":"44ebe415eb5c9019cd68ccd20eb769e202bf776a38400501b97f85411124a7dc"} Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.835681 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44ebe415eb5c9019cd68ccd20eb769e202bf776a38400501b97f85411124a7dc" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.835697 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6ngmq" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.884326 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx"] Oct 12 08:00:13 crc kubenswrapper[4599]: E1012 08:00:13.884761 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e862c4d5-24d0-42ee-82f5-7a17fc6773aa" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.884778 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="e862c4d5-24d0-42ee-82f5-7a17fc6773aa" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 12 08:00:13 crc kubenswrapper[4599]: E1012 08:00:13.884800 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a4bb464-44b3-460f-b9d1-92872095c4cf" containerName="collect-profiles" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.884806 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a4bb464-44b3-460f-b9d1-92872095c4cf" containerName="collect-profiles" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.885009 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a4bb464-44b3-460f-b9d1-92872095c4cf" containerName="collect-profiles" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.885025 4599 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e862c4d5-24d0-42ee-82f5-7a17fc6773aa" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.887811 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.889808 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.890113 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.890124 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.890745 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x82pw" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.893361 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx"] Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.899368 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-594hv\" (UniqueName: \"kubernetes.io/projected/dd9a9999-dc26-4df4-b259-dbdbc31766f3-kube-api-access-594hv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n2smx\" (UID: \"dd9a9999-dc26-4df4-b259-dbdbc31766f3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.899407 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd9a9999-dc26-4df4-b259-dbdbc31766f3-inventory\") 
pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n2smx\" (UID: \"dd9a9999-dc26-4df4-b259-dbdbc31766f3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx" Oct 12 08:00:13 crc kubenswrapper[4599]: I1012 08:00:13.899530 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd9a9999-dc26-4df4-b259-dbdbc31766f3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n2smx\" (UID: \"dd9a9999-dc26-4df4-b259-dbdbc31766f3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx" Oct 12 08:00:14 crc kubenswrapper[4599]: I1012 08:00:14.001148 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd9a9999-dc26-4df4-b259-dbdbc31766f3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n2smx\" (UID: \"dd9a9999-dc26-4df4-b259-dbdbc31766f3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx" Oct 12 08:00:14 crc kubenswrapper[4599]: I1012 08:00:14.001266 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-594hv\" (UniqueName: \"kubernetes.io/projected/dd9a9999-dc26-4df4-b259-dbdbc31766f3-kube-api-access-594hv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n2smx\" (UID: \"dd9a9999-dc26-4df4-b259-dbdbc31766f3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx" Oct 12 08:00:14 crc kubenswrapper[4599]: I1012 08:00:14.001306 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd9a9999-dc26-4df4-b259-dbdbc31766f3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n2smx\" (UID: \"dd9a9999-dc26-4df4-b259-dbdbc31766f3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx" Oct 12 08:00:14 crc kubenswrapper[4599]: I1012 08:00:14.005670 
4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd9a9999-dc26-4df4-b259-dbdbc31766f3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n2smx\" (UID: \"dd9a9999-dc26-4df4-b259-dbdbc31766f3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx" Oct 12 08:00:14 crc kubenswrapper[4599]: I1012 08:00:14.005845 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd9a9999-dc26-4df4-b259-dbdbc31766f3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n2smx\" (UID: \"dd9a9999-dc26-4df4-b259-dbdbc31766f3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx" Oct 12 08:00:14 crc kubenswrapper[4599]: I1012 08:00:14.014968 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-594hv\" (UniqueName: \"kubernetes.io/projected/dd9a9999-dc26-4df4-b259-dbdbc31766f3-kube-api-access-594hv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n2smx\" (UID: \"dd9a9999-dc26-4df4-b259-dbdbc31766f3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx" Oct 12 08:00:14 crc kubenswrapper[4599]: I1012 08:00:14.205730 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx" Oct 12 08:00:14 crc kubenswrapper[4599]: I1012 08:00:14.639603 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx"] Oct 12 08:00:14 crc kubenswrapper[4599]: I1012 08:00:14.844988 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx" event={"ID":"dd9a9999-dc26-4df4-b259-dbdbc31766f3","Type":"ContainerStarted","Data":"25da1b02d5036a918dcd6f67576e6e0872c72efff4dd9eb9198ebe8c577efcb6"} Oct 12 08:00:15 crc kubenswrapper[4599]: I1012 08:00:15.852245 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx" event={"ID":"dd9a9999-dc26-4df4-b259-dbdbc31766f3","Type":"ContainerStarted","Data":"a503990f44b75c1d2796426686b9a4fc6f56972302e92cefc0f5f187faafc6d1"} Oct 12 08:00:15 crc kubenswrapper[4599]: I1012 08:00:15.866019 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx" podStartSLOduration=2.113005711 podStartE2EDuration="2.86600648s" podCreationTimestamp="2025-10-12 08:00:13 +0000 UTC" firstStartedPulling="2025-10-12 08:00:14.64396959 +0000 UTC m=+1511.433165092" lastFinishedPulling="2025-10-12 08:00:15.396970359 +0000 UTC m=+1512.186165861" observedRunningTime="2025-10-12 08:00:15.861166889 +0000 UTC m=+1512.650362392" watchObservedRunningTime="2025-10-12 08:00:15.86600648 +0000 UTC m=+1512.655201983" Oct 12 08:00:22 crc kubenswrapper[4599]: I1012 08:00:22.033071 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-rfqhk"] Oct 12 08:00:22 crc kubenswrapper[4599]: I1012 08:00:22.038127 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-rfqhk"] Oct 12 08:00:22 crc kubenswrapper[4599]: I1012 
08:00:22.545541 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:00:22 crc kubenswrapper[4599]: E1012 08:00:22.545840 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:00:23 crc kubenswrapper[4599]: I1012 08:00:23.023562 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-2p2tr"] Oct 12 08:00:23 crc kubenswrapper[4599]: I1012 08:00:23.029528 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-nsn2r"] Oct 12 08:00:23 crc kubenswrapper[4599]: I1012 08:00:23.036030 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-2p2tr"] Oct 12 08:00:23 crc kubenswrapper[4599]: I1012 08:00:23.041242 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-nsn2r"] Oct 12 08:00:23 crc kubenswrapper[4599]: I1012 08:00:23.556014 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23162836-2301-4422-b6c0-30591a99bed0" path="/var/lib/kubelet/pods/23162836-2301-4422-b6c0-30591a99bed0/volumes" Oct 12 08:00:23 crc kubenswrapper[4599]: I1012 08:00:23.556860 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84022a3c-4f3a-40f7-b3f6-6aa3a83fb174" path="/var/lib/kubelet/pods/84022a3c-4f3a-40f7-b3f6-6aa3a83fb174/volumes" Oct 12 08:00:23 crc kubenswrapper[4599]: I1012 08:00:23.557403 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac30765e-04ab-4e53-87ca-8d5bcf9df91b" 
path="/var/lib/kubelet/pods/ac30765e-04ab-4e53-87ca-8d5bcf9df91b/volumes" Oct 12 08:00:23 crc kubenswrapper[4599]: E1012 08:00:23.974050 4599 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a4bb464_44b3_460f_b9d1_92872095c4cf.slice/crio-085363c6b4b77b46a53e31c0be2975aadc69f37ccc1ddfca470af1c0a7f9fa06\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a4bb464_44b3_460f_b9d1_92872095c4cf.slice\": RecentStats: unable to find data in memory cache]" Oct 12 08:00:33 crc kubenswrapper[4599]: I1012 08:00:33.027961 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-365a-account-create-hgsfs"] Oct 12 08:00:33 crc kubenswrapper[4599]: I1012 08:00:33.037923 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-365a-account-create-hgsfs"] Oct 12 08:00:33 crc kubenswrapper[4599]: I1012 08:00:33.554662 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c351afb9-ec01-409c-bde8-b466a94e4d9d" path="/var/lib/kubelet/pods/c351afb9-ec01-409c-bde8-b466a94e4d9d/volumes" Oct 12 08:00:34 crc kubenswrapper[4599]: I1012 08:00:34.019590 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-d46a-account-create-gnb8g"] Oct 12 08:00:34 crc kubenswrapper[4599]: I1012 08:00:34.025893 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-8bb3-account-create-cfxfp"] Oct 12 08:00:34 crc kubenswrapper[4599]: I1012 08:00:34.032958 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-d46a-account-create-gnb8g"] Oct 12 08:00:34 crc kubenswrapper[4599]: I1012 08:00:34.038621 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-8bb3-account-create-cfxfp"] Oct 12 08:00:34 crc kubenswrapper[4599]: E1012 08:00:34.176594 4599 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a4bb464_44b3_460f_b9d1_92872095c4cf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a4bb464_44b3_460f_b9d1_92872095c4cf.slice/crio-085363c6b4b77b46a53e31c0be2975aadc69f37ccc1ddfca470af1c0a7f9fa06\": RecentStats: unable to find data in memory cache]" Oct 12 08:00:34 crc kubenswrapper[4599]: I1012 08:00:34.545014 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:00:34 crc kubenswrapper[4599]: E1012 08:00:34.545243 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:00:35 crc kubenswrapper[4599]: I1012 08:00:35.554310 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3" path="/var/lib/kubelet/pods/d20b8ffe-fcc1-4097-a968-0cbe4bcd30a3/volumes" Oct 12 08:00:35 crc kubenswrapper[4599]: I1012 08:00:35.555377 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f812212a-8c44-416a-8671-bceb361a6779" path="/var/lib/kubelet/pods/f812212a-8c44-416a-8671-bceb361a6779/volumes" Oct 12 08:00:44 crc kubenswrapper[4599]: E1012 08:00:44.378517 4599 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a4bb464_44b3_460f_b9d1_92872095c4cf.slice\": RecentStats: unable to find data in 
memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a4bb464_44b3_460f_b9d1_92872095c4cf.slice/crio-085363c6b4b77b46a53e31c0be2975aadc69f37ccc1ddfca470af1c0a7f9fa06\": RecentStats: unable to find data in memory cache]" Oct 12 08:00:48 crc kubenswrapper[4599]: I1012 08:00:48.544999 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:00:48 crc kubenswrapper[4599]: E1012 08:00:48.545479 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:00:53 crc kubenswrapper[4599]: I1012 08:00:53.028146 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-q7gzz"] Oct 12 08:00:53 crc kubenswrapper[4599]: I1012 08:00:53.033834 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-q7gzz"] Oct 12 08:00:53 crc kubenswrapper[4599]: I1012 08:00:53.554454 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3c31956-8a05-421b-86cf-5e04f27a0ad1" path="/var/lib/kubelet/pods/c3c31956-8a05-421b-86cf-5e04f27a0ad1/volumes" Oct 12 08:00:54 crc kubenswrapper[4599]: E1012 08:00:54.576279 4599 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a4bb464_44b3_460f_b9d1_92872095c4cf.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a4bb464_44b3_460f_b9d1_92872095c4cf.slice/crio-085363c6b4b77b46a53e31c0be2975aadc69f37ccc1ddfca470af1c0a7f9fa06\": RecentStats: unable to find data in memory cache]" Oct 12 08:00:59 crc kubenswrapper[4599]: I1012 08:00:59.545401 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:00:59 crc kubenswrapper[4599]: E1012 08:00:59.546027 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:01:00 crc kubenswrapper[4599]: I1012 08:01:00.133830 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29337601-bb6kg"] Oct 12 08:01:00 crc kubenswrapper[4599]: I1012 08:01:00.137203 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29337601-bb6kg" Oct 12 08:01:00 crc kubenswrapper[4599]: I1012 08:01:00.159091 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29337601-bb6kg"] Oct 12 08:01:00 crc kubenswrapper[4599]: I1012 08:01:00.332847 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6grw5\" (UniqueName: \"kubernetes.io/projected/53717d04-56c1-42cc-af4a-f7edc51e3611-kube-api-access-6grw5\") pod \"keystone-cron-29337601-bb6kg\" (UID: \"53717d04-56c1-42cc-af4a-f7edc51e3611\") " pod="openstack/keystone-cron-29337601-bb6kg" Oct 12 08:01:00 crc kubenswrapper[4599]: I1012 08:01:00.333282 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53717d04-56c1-42cc-af4a-f7edc51e3611-combined-ca-bundle\") pod \"keystone-cron-29337601-bb6kg\" (UID: \"53717d04-56c1-42cc-af4a-f7edc51e3611\") " pod="openstack/keystone-cron-29337601-bb6kg" Oct 12 08:01:00 crc kubenswrapper[4599]: I1012 08:01:00.333476 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53717d04-56c1-42cc-af4a-f7edc51e3611-config-data\") pod \"keystone-cron-29337601-bb6kg\" (UID: \"53717d04-56c1-42cc-af4a-f7edc51e3611\") " pod="openstack/keystone-cron-29337601-bb6kg" Oct 12 08:01:00 crc kubenswrapper[4599]: I1012 08:01:00.333625 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53717d04-56c1-42cc-af4a-f7edc51e3611-fernet-keys\") pod \"keystone-cron-29337601-bb6kg\" (UID: \"53717d04-56c1-42cc-af4a-f7edc51e3611\") " pod="openstack/keystone-cron-29337601-bb6kg" Oct 12 08:01:00 crc kubenswrapper[4599]: I1012 08:01:00.434235 4599 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53717d04-56c1-42cc-af4a-f7edc51e3611-config-data\") pod \"keystone-cron-29337601-bb6kg\" (UID: \"53717d04-56c1-42cc-af4a-f7edc51e3611\") " pod="openstack/keystone-cron-29337601-bb6kg" Oct 12 08:01:00 crc kubenswrapper[4599]: I1012 08:01:00.434377 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53717d04-56c1-42cc-af4a-f7edc51e3611-fernet-keys\") pod \"keystone-cron-29337601-bb6kg\" (UID: \"53717d04-56c1-42cc-af4a-f7edc51e3611\") " pod="openstack/keystone-cron-29337601-bb6kg" Oct 12 08:01:00 crc kubenswrapper[4599]: I1012 08:01:00.434420 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6grw5\" (UniqueName: \"kubernetes.io/projected/53717d04-56c1-42cc-af4a-f7edc51e3611-kube-api-access-6grw5\") pod \"keystone-cron-29337601-bb6kg\" (UID: \"53717d04-56c1-42cc-af4a-f7edc51e3611\") " pod="openstack/keystone-cron-29337601-bb6kg" Oct 12 08:01:00 crc kubenswrapper[4599]: I1012 08:01:00.434451 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53717d04-56c1-42cc-af4a-f7edc51e3611-combined-ca-bundle\") pod \"keystone-cron-29337601-bb6kg\" (UID: \"53717d04-56c1-42cc-af4a-f7edc51e3611\") " pod="openstack/keystone-cron-29337601-bb6kg" Oct 12 08:01:00 crc kubenswrapper[4599]: I1012 08:01:00.440087 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53717d04-56c1-42cc-af4a-f7edc51e3611-combined-ca-bundle\") pod \"keystone-cron-29337601-bb6kg\" (UID: \"53717d04-56c1-42cc-af4a-f7edc51e3611\") " pod="openstack/keystone-cron-29337601-bb6kg" Oct 12 08:01:00 crc kubenswrapper[4599]: I1012 08:01:00.440151 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/53717d04-56c1-42cc-af4a-f7edc51e3611-config-data\") pod \"keystone-cron-29337601-bb6kg\" (UID: \"53717d04-56c1-42cc-af4a-f7edc51e3611\") " pod="openstack/keystone-cron-29337601-bb6kg" Oct 12 08:01:00 crc kubenswrapper[4599]: I1012 08:01:00.441152 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53717d04-56c1-42cc-af4a-f7edc51e3611-fernet-keys\") pod \"keystone-cron-29337601-bb6kg\" (UID: \"53717d04-56c1-42cc-af4a-f7edc51e3611\") " pod="openstack/keystone-cron-29337601-bb6kg" Oct 12 08:01:00 crc kubenswrapper[4599]: I1012 08:01:00.447895 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6grw5\" (UniqueName: \"kubernetes.io/projected/53717d04-56c1-42cc-af4a-f7edc51e3611-kube-api-access-6grw5\") pod \"keystone-cron-29337601-bb6kg\" (UID: \"53717d04-56c1-42cc-af4a-f7edc51e3611\") " pod="openstack/keystone-cron-29337601-bb6kg" Oct 12 08:01:00 crc kubenswrapper[4599]: I1012 08:01:00.454072 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29337601-bb6kg" Oct 12 08:01:00 crc kubenswrapper[4599]: I1012 08:01:00.827587 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29337601-bb6kg"] Oct 12 08:01:01 crc kubenswrapper[4599]: I1012 08:01:01.211722 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29337601-bb6kg" event={"ID":"53717d04-56c1-42cc-af4a-f7edc51e3611","Type":"ContainerStarted","Data":"da4dc601786416e4e461d9b5ad8e2fe616d23f89d64ba53867119c0d81ff44dd"} Oct 12 08:01:01 crc kubenswrapper[4599]: I1012 08:01:01.211771 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29337601-bb6kg" event={"ID":"53717d04-56c1-42cc-af4a-f7edc51e3611","Type":"ContainerStarted","Data":"7477d0ae01cbab561052e03f676f1e2d1b646966f126d48f2c2678a9ce3e9555"} Oct 12 08:01:01 crc kubenswrapper[4599]: I1012 08:01:01.213220 4599 generic.go:334] "Generic (PLEG): container finished" podID="dd9a9999-dc26-4df4-b259-dbdbc31766f3" containerID="a503990f44b75c1d2796426686b9a4fc6f56972302e92cefc0f5f187faafc6d1" exitCode=2 Oct 12 08:01:01 crc kubenswrapper[4599]: I1012 08:01:01.213288 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx" event={"ID":"dd9a9999-dc26-4df4-b259-dbdbc31766f3","Type":"ContainerDied","Data":"a503990f44b75c1d2796426686b9a4fc6f56972302e92cefc0f5f187faafc6d1"} Oct 12 08:01:01 crc kubenswrapper[4599]: I1012 08:01:01.229005 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29337601-bb6kg" podStartSLOduration=1.228991369 podStartE2EDuration="1.228991369s" podCreationTimestamp="2025-10-12 08:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 08:01:01.226743247 +0000 UTC m=+1558.015938750" watchObservedRunningTime="2025-10-12 08:01:01.228991369 
+0000 UTC m=+1558.018186871" Oct 12 08:01:02 crc kubenswrapper[4599]: I1012 08:01:02.530777 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx" Oct 12 08:01:02 crc kubenswrapper[4599]: I1012 08:01:02.672482 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-594hv\" (UniqueName: \"kubernetes.io/projected/dd9a9999-dc26-4df4-b259-dbdbc31766f3-kube-api-access-594hv\") pod \"dd9a9999-dc26-4df4-b259-dbdbc31766f3\" (UID: \"dd9a9999-dc26-4df4-b259-dbdbc31766f3\") " Oct 12 08:01:02 crc kubenswrapper[4599]: I1012 08:01:02.672644 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd9a9999-dc26-4df4-b259-dbdbc31766f3-inventory\") pod \"dd9a9999-dc26-4df4-b259-dbdbc31766f3\" (UID: \"dd9a9999-dc26-4df4-b259-dbdbc31766f3\") " Oct 12 08:01:02 crc kubenswrapper[4599]: I1012 08:01:02.672692 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd9a9999-dc26-4df4-b259-dbdbc31766f3-ssh-key\") pod \"dd9a9999-dc26-4df4-b259-dbdbc31766f3\" (UID: \"dd9a9999-dc26-4df4-b259-dbdbc31766f3\") " Oct 12 08:01:02 crc kubenswrapper[4599]: I1012 08:01:02.677570 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd9a9999-dc26-4df4-b259-dbdbc31766f3-kube-api-access-594hv" (OuterVolumeSpecName: "kube-api-access-594hv") pod "dd9a9999-dc26-4df4-b259-dbdbc31766f3" (UID: "dd9a9999-dc26-4df4-b259-dbdbc31766f3"). InnerVolumeSpecName "kube-api-access-594hv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:01:02 crc kubenswrapper[4599]: I1012 08:01:02.694519 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd9a9999-dc26-4df4-b259-dbdbc31766f3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dd9a9999-dc26-4df4-b259-dbdbc31766f3" (UID: "dd9a9999-dc26-4df4-b259-dbdbc31766f3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:01:02 crc kubenswrapper[4599]: I1012 08:01:02.694840 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd9a9999-dc26-4df4-b259-dbdbc31766f3-inventory" (OuterVolumeSpecName: "inventory") pod "dd9a9999-dc26-4df4-b259-dbdbc31766f3" (UID: "dd9a9999-dc26-4df4-b259-dbdbc31766f3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:01:02 crc kubenswrapper[4599]: I1012 08:01:02.774984 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-594hv\" (UniqueName: \"kubernetes.io/projected/dd9a9999-dc26-4df4-b259-dbdbc31766f3-kube-api-access-594hv\") on node \"crc\" DevicePath \"\"" Oct 12 08:01:02 crc kubenswrapper[4599]: I1012 08:01:02.775019 4599 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd9a9999-dc26-4df4-b259-dbdbc31766f3-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 08:01:02 crc kubenswrapper[4599]: I1012 08:01:02.775029 4599 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd9a9999-dc26-4df4-b259-dbdbc31766f3-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 08:01:03 crc kubenswrapper[4599]: I1012 08:01:03.228853 4599 generic.go:334] "Generic (PLEG): container finished" podID="53717d04-56c1-42cc-af4a-f7edc51e3611" containerID="da4dc601786416e4e461d9b5ad8e2fe616d23f89d64ba53867119c0d81ff44dd" exitCode=0 Oct 12 08:01:03 crc kubenswrapper[4599]: I1012 08:01:03.228924 
4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29337601-bb6kg" event={"ID":"53717d04-56c1-42cc-af4a-f7edc51e3611","Type":"ContainerDied","Data":"da4dc601786416e4e461d9b5ad8e2fe616d23f89d64ba53867119c0d81ff44dd"} Oct 12 08:01:03 crc kubenswrapper[4599]: I1012 08:01:03.230142 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx" event={"ID":"dd9a9999-dc26-4df4-b259-dbdbc31766f3","Type":"ContainerDied","Data":"25da1b02d5036a918dcd6f67576e6e0872c72efff4dd9eb9198ebe8c577efcb6"} Oct 12 08:01:03 crc kubenswrapper[4599]: I1012 08:01:03.230166 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25da1b02d5036a918dcd6f67576e6e0872c72efff4dd9eb9198ebe8c577efcb6" Oct 12 08:01:03 crc kubenswrapper[4599]: I1012 08:01:03.230217 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n2smx" Oct 12 08:01:04 crc kubenswrapper[4599]: I1012 08:01:04.511022 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29337601-bb6kg" Oct 12 08:01:04 crc kubenswrapper[4599]: I1012 08:01:04.599557 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6grw5\" (UniqueName: \"kubernetes.io/projected/53717d04-56c1-42cc-af4a-f7edc51e3611-kube-api-access-6grw5\") pod \"53717d04-56c1-42cc-af4a-f7edc51e3611\" (UID: \"53717d04-56c1-42cc-af4a-f7edc51e3611\") " Oct 12 08:01:04 crc kubenswrapper[4599]: I1012 08:01:04.599628 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53717d04-56c1-42cc-af4a-f7edc51e3611-combined-ca-bundle\") pod \"53717d04-56c1-42cc-af4a-f7edc51e3611\" (UID: \"53717d04-56c1-42cc-af4a-f7edc51e3611\") " Oct 12 08:01:04 crc kubenswrapper[4599]: I1012 08:01:04.599660 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53717d04-56c1-42cc-af4a-f7edc51e3611-fernet-keys\") pod \"53717d04-56c1-42cc-af4a-f7edc51e3611\" (UID: \"53717d04-56c1-42cc-af4a-f7edc51e3611\") " Oct 12 08:01:04 crc kubenswrapper[4599]: I1012 08:01:04.599688 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53717d04-56c1-42cc-af4a-f7edc51e3611-config-data\") pod \"53717d04-56c1-42cc-af4a-f7edc51e3611\" (UID: \"53717d04-56c1-42cc-af4a-f7edc51e3611\") " Oct 12 08:01:04 crc kubenswrapper[4599]: I1012 08:01:04.604844 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53717d04-56c1-42cc-af4a-f7edc51e3611-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "53717d04-56c1-42cc-af4a-f7edc51e3611" (UID: "53717d04-56c1-42cc-af4a-f7edc51e3611"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:01:04 crc kubenswrapper[4599]: I1012 08:01:04.606774 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53717d04-56c1-42cc-af4a-f7edc51e3611-kube-api-access-6grw5" (OuterVolumeSpecName: "kube-api-access-6grw5") pod "53717d04-56c1-42cc-af4a-f7edc51e3611" (UID: "53717d04-56c1-42cc-af4a-f7edc51e3611"). InnerVolumeSpecName "kube-api-access-6grw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:01:04 crc kubenswrapper[4599]: I1012 08:01:04.624315 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53717d04-56c1-42cc-af4a-f7edc51e3611-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53717d04-56c1-42cc-af4a-f7edc51e3611" (UID: "53717d04-56c1-42cc-af4a-f7edc51e3611"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:01:04 crc kubenswrapper[4599]: I1012 08:01:04.640144 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53717d04-56c1-42cc-af4a-f7edc51e3611-config-data" (OuterVolumeSpecName: "config-data") pod "53717d04-56c1-42cc-af4a-f7edc51e3611" (UID: "53717d04-56c1-42cc-af4a-f7edc51e3611"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:01:04 crc kubenswrapper[4599]: I1012 08:01:04.701258 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6grw5\" (UniqueName: \"kubernetes.io/projected/53717d04-56c1-42cc-af4a-f7edc51e3611-kube-api-access-6grw5\") on node \"crc\" DevicePath \"\"" Oct 12 08:01:04 crc kubenswrapper[4599]: I1012 08:01:04.701282 4599 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53717d04-56c1-42cc-af4a-f7edc51e3611-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 08:01:04 crc kubenswrapper[4599]: I1012 08:01:04.701291 4599 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53717d04-56c1-42cc-af4a-f7edc51e3611-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 12 08:01:04 crc kubenswrapper[4599]: I1012 08:01:04.701299 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53717d04-56c1-42cc-af4a-f7edc51e3611-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 08:01:05 crc kubenswrapper[4599]: I1012 08:01:05.247109 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29337601-bb6kg" event={"ID":"53717d04-56c1-42cc-af4a-f7edc51e3611","Type":"ContainerDied","Data":"7477d0ae01cbab561052e03f676f1e2d1b646966f126d48f2c2678a9ce3e9555"} Oct 12 08:01:05 crc kubenswrapper[4599]: I1012 08:01:05.247374 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7477d0ae01cbab561052e03f676f1e2d1b646966f126d48f2c2678a9ce3e9555" Oct 12 08:01:05 crc kubenswrapper[4599]: I1012 08:01:05.247156 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29337601-bb6kg" Oct 12 08:01:07 crc kubenswrapper[4599]: I1012 08:01:07.029472 4599 scope.go:117] "RemoveContainer" containerID="0cf8e6c98ac835ac25ab6609fceef30863df8bc0ca707bb8b0066f7f1bd23692" Oct 12 08:01:07 crc kubenswrapper[4599]: I1012 08:01:07.053315 4599 scope.go:117] "RemoveContainer" containerID="83db2f60d001c5dcb6721a23c2ad2ab8bbbe0aa9486ba3cbf623c5b688aa256c" Oct 12 08:01:07 crc kubenswrapper[4599]: I1012 08:01:07.080823 4599 scope.go:117] "RemoveContainer" containerID="aa5dd1932318e1b477a7aa7cbe3e7779728a34d3e12999e78e4da4a5cd10f3b6" Oct 12 08:01:07 crc kubenswrapper[4599]: I1012 08:01:07.115501 4599 scope.go:117] "RemoveContainer" containerID="fa1c6ad6118afca12345e075d1eee8718b7d8386aa4167c93749b5827db46642" Oct 12 08:01:07 crc kubenswrapper[4599]: I1012 08:01:07.159056 4599 scope.go:117] "RemoveContainer" containerID="f7e0ba824b59471ddc71385a78893d38f268f35e5ed401fe423e7e79ae0fb537" Oct 12 08:01:07 crc kubenswrapper[4599]: I1012 08:01:07.177144 4599 scope.go:117] "RemoveContainer" containerID="058b39d173e98ee0de66806ea3590e63914b0753557dc443f86bfee5ab33f4e1" Oct 12 08:01:07 crc kubenswrapper[4599]: I1012 08:01:07.204964 4599 scope.go:117] "RemoveContainer" containerID="a84af588ccb5ecbc5a78a080a52ad804c0d650adac4e87c10abdbcf34d96c2e5" Oct 12 08:01:08 crc kubenswrapper[4599]: I1012 08:01:08.669540 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9ng9p"] Oct 12 08:01:08 crc kubenswrapper[4599]: E1012 08:01:08.670390 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd9a9999-dc26-4df4-b259-dbdbc31766f3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 08:01:08 crc kubenswrapper[4599]: I1012 08:01:08.670409 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd9a9999-dc26-4df4-b259-dbdbc31766f3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 08:01:08 crc kubenswrapper[4599]: 
E1012 08:01:08.670446 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53717d04-56c1-42cc-af4a-f7edc51e3611" containerName="keystone-cron" Oct 12 08:01:08 crc kubenswrapper[4599]: I1012 08:01:08.670453 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="53717d04-56c1-42cc-af4a-f7edc51e3611" containerName="keystone-cron" Oct 12 08:01:08 crc kubenswrapper[4599]: I1012 08:01:08.670655 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="53717d04-56c1-42cc-af4a-f7edc51e3611" containerName="keystone-cron" Oct 12 08:01:08 crc kubenswrapper[4599]: I1012 08:01:08.670692 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd9a9999-dc26-4df4-b259-dbdbc31766f3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 08:01:08 crc kubenswrapper[4599]: I1012 08:01:08.672109 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9ng9p" Oct 12 08:01:08 crc kubenswrapper[4599]: I1012 08:01:08.678719 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9ng9p"] Oct 12 08:01:08 crc kubenswrapper[4599]: I1012 08:01:08.766267 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfncl\" (UniqueName: \"kubernetes.io/projected/5cc9b474-202d-41b7-adc1-a31989de8be2-kube-api-access-mfncl\") pod \"certified-operators-9ng9p\" (UID: \"5cc9b474-202d-41b7-adc1-a31989de8be2\") " pod="openshift-marketplace/certified-operators-9ng9p" Oct 12 08:01:08 crc kubenswrapper[4599]: I1012 08:01:08.766486 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cc9b474-202d-41b7-adc1-a31989de8be2-catalog-content\") pod \"certified-operators-9ng9p\" (UID: \"5cc9b474-202d-41b7-adc1-a31989de8be2\") " pod="openshift-marketplace/certified-operators-9ng9p" 
Oct 12 08:01:08 crc kubenswrapper[4599]: I1012 08:01:08.766756 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cc9b474-202d-41b7-adc1-a31989de8be2-utilities\") pod \"certified-operators-9ng9p\" (UID: \"5cc9b474-202d-41b7-adc1-a31989de8be2\") " pod="openshift-marketplace/certified-operators-9ng9p" Oct 12 08:01:08 crc kubenswrapper[4599]: I1012 08:01:08.868756 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfncl\" (UniqueName: \"kubernetes.io/projected/5cc9b474-202d-41b7-adc1-a31989de8be2-kube-api-access-mfncl\") pod \"certified-operators-9ng9p\" (UID: \"5cc9b474-202d-41b7-adc1-a31989de8be2\") " pod="openshift-marketplace/certified-operators-9ng9p" Oct 12 08:01:08 crc kubenswrapper[4599]: I1012 08:01:08.868929 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cc9b474-202d-41b7-adc1-a31989de8be2-catalog-content\") pod \"certified-operators-9ng9p\" (UID: \"5cc9b474-202d-41b7-adc1-a31989de8be2\") " pod="openshift-marketplace/certified-operators-9ng9p" Oct 12 08:01:08 crc kubenswrapper[4599]: I1012 08:01:08.869009 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cc9b474-202d-41b7-adc1-a31989de8be2-utilities\") pod \"certified-operators-9ng9p\" (UID: \"5cc9b474-202d-41b7-adc1-a31989de8be2\") " pod="openshift-marketplace/certified-operators-9ng9p" Oct 12 08:01:08 crc kubenswrapper[4599]: I1012 08:01:08.869483 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cc9b474-202d-41b7-adc1-a31989de8be2-utilities\") pod \"certified-operators-9ng9p\" (UID: \"5cc9b474-202d-41b7-adc1-a31989de8be2\") " pod="openshift-marketplace/certified-operators-9ng9p" Oct 12 08:01:08 crc 
kubenswrapper[4599]: I1012 08:01:08.869549 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cc9b474-202d-41b7-adc1-a31989de8be2-catalog-content\") pod \"certified-operators-9ng9p\" (UID: \"5cc9b474-202d-41b7-adc1-a31989de8be2\") " pod="openshift-marketplace/certified-operators-9ng9p" Oct 12 08:01:08 crc kubenswrapper[4599]: I1012 08:01:08.887610 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfncl\" (UniqueName: \"kubernetes.io/projected/5cc9b474-202d-41b7-adc1-a31989de8be2-kube-api-access-mfncl\") pod \"certified-operators-9ng9p\" (UID: \"5cc9b474-202d-41b7-adc1-a31989de8be2\") " pod="openshift-marketplace/certified-operators-9ng9p" Oct 12 08:01:08 crc kubenswrapper[4599]: I1012 08:01:08.994632 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9ng9p" Oct 12 08:01:09 crc kubenswrapper[4599]: I1012 08:01:09.439417 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9ng9p"] Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.017887 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65"] Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.019110 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65" Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.020617 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.025551 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.025551 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.025554 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x82pw" Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.030175 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65"] Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.193827 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9fd7ec79-0a36-4ac6-a81a-486df9b2ba89-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sck65\" (UID: \"9fd7ec79-0a36-4ac6-a81a-486df9b2ba89\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65" Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.194226 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fd7ec79-0a36-4ac6-a81a-486df9b2ba89-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sck65\" (UID: \"9fd7ec79-0a36-4ac6-a81a-486df9b2ba89\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65" Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.194252 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6p2s\" (UniqueName: \"kubernetes.io/projected/9fd7ec79-0a36-4ac6-a81a-486df9b2ba89-kube-api-access-g6p2s\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sck65\" (UID: \"9fd7ec79-0a36-4ac6-a81a-486df9b2ba89\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65" Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.292206 4599 generic.go:334] "Generic (PLEG): container finished" podID="5cc9b474-202d-41b7-adc1-a31989de8be2" containerID="4a58b9a3559a8f4370dd025183bebb9ed74d0e2466cd5036195c3d466a43bb11" exitCode=0 Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.292254 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ng9p" event={"ID":"5cc9b474-202d-41b7-adc1-a31989de8be2","Type":"ContainerDied","Data":"4a58b9a3559a8f4370dd025183bebb9ed74d0e2466cd5036195c3d466a43bb11"} Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.292281 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ng9p" event={"ID":"5cc9b474-202d-41b7-adc1-a31989de8be2","Type":"ContainerStarted","Data":"9a2545475d4d94e889e8123b3ff0bbdf0c348e28fdbe2a894fc18dfbdc5e5f97"} Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.295442 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fd7ec79-0a36-4ac6-a81a-486df9b2ba89-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sck65\" (UID: \"9fd7ec79-0a36-4ac6-a81a-486df9b2ba89\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65" Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.295475 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6p2s\" (UniqueName: \"kubernetes.io/projected/9fd7ec79-0a36-4ac6-a81a-486df9b2ba89-kube-api-access-g6p2s\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-sck65\" (UID: \"9fd7ec79-0a36-4ac6-a81a-486df9b2ba89\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65" Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.295555 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9fd7ec79-0a36-4ac6-a81a-486df9b2ba89-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sck65\" (UID: \"9fd7ec79-0a36-4ac6-a81a-486df9b2ba89\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65" Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.300780 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9fd7ec79-0a36-4ac6-a81a-486df9b2ba89-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sck65\" (UID: \"9fd7ec79-0a36-4ac6-a81a-486df9b2ba89\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65" Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.301129 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fd7ec79-0a36-4ac6-a81a-486df9b2ba89-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sck65\" (UID: \"9fd7ec79-0a36-4ac6-a81a-486df9b2ba89\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65" Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.309487 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6p2s\" (UniqueName: \"kubernetes.io/projected/9fd7ec79-0a36-4ac6-a81a-486df9b2ba89-kube-api-access-g6p2s\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sck65\" (UID: \"9fd7ec79-0a36-4ac6-a81a-486df9b2ba89\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65" Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.334659 4599 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65" Oct 12 08:01:10 crc kubenswrapper[4599]: I1012 08:01:10.767415 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65"] Oct 12 08:01:10 crc kubenswrapper[4599]: W1012 08:01:10.774072 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fd7ec79_0a36_4ac6_a81a_486df9b2ba89.slice/crio-191ffdb91f9d39e6c6f23c257a6ceb22ba2aba8e50b0be709ad4abf81753ae64 WatchSource:0}: Error finding container 191ffdb91f9d39e6c6f23c257a6ceb22ba2aba8e50b0be709ad4abf81753ae64: Status 404 returned error can't find the container with id 191ffdb91f9d39e6c6f23c257a6ceb22ba2aba8e50b0be709ad4abf81753ae64 Oct 12 08:01:11 crc kubenswrapper[4599]: I1012 08:01:11.302371 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65" event={"ID":"9fd7ec79-0a36-4ac6-a81a-486df9b2ba89","Type":"ContainerStarted","Data":"191ffdb91f9d39e6c6f23c257a6ceb22ba2aba8e50b0be709ad4abf81753ae64"} Oct 12 08:01:12 crc kubenswrapper[4599]: I1012 08:01:12.311600 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65" event={"ID":"9fd7ec79-0a36-4ac6-a81a-486df9b2ba89","Type":"ContainerStarted","Data":"099b53966882e44b42e16ba86396a58baa892ea90b9f9335618d2304a4be7fc4"} Oct 12 08:01:12 crc kubenswrapper[4599]: I1012 08:01:12.315021 4599 generic.go:334] "Generic (PLEG): container finished" podID="5cc9b474-202d-41b7-adc1-a31989de8be2" containerID="a36d4e0342b3133ae8d3c1bcbc1d304a82920cb244d1e4450a6c707fded6e8d6" exitCode=0 Oct 12 08:01:12 crc kubenswrapper[4599]: I1012 08:01:12.315068 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ng9p" 
event={"ID":"5cc9b474-202d-41b7-adc1-a31989de8be2","Type":"ContainerDied","Data":"a36d4e0342b3133ae8d3c1bcbc1d304a82920cb244d1e4450a6c707fded6e8d6"} Oct 12 08:01:12 crc kubenswrapper[4599]: I1012 08:01:12.326040 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65" podStartSLOduration=1.684141537 podStartE2EDuration="2.325386434s" podCreationTimestamp="2025-10-12 08:01:10 +0000 UTC" firstStartedPulling="2025-10-12 08:01:10.776526578 +0000 UTC m=+1567.565722080" lastFinishedPulling="2025-10-12 08:01:11.417771475 +0000 UTC m=+1568.206966977" observedRunningTime="2025-10-12 08:01:12.325166419 +0000 UTC m=+1569.114361921" watchObservedRunningTime="2025-10-12 08:01:12.325386434 +0000 UTC m=+1569.114581936" Oct 12 08:01:12 crc kubenswrapper[4599]: I1012 08:01:12.545540 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:01:12 crc kubenswrapper[4599]: E1012 08:01:12.545775 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:01:13 crc kubenswrapper[4599]: I1012 08:01:13.023535 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-f7jhc"] Oct 12 08:01:13 crc kubenswrapper[4599]: I1012 08:01:13.029938 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-f7jhc"] Oct 12 08:01:13 crc kubenswrapper[4599]: I1012 08:01:13.331811 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ng9p" 
event={"ID":"5cc9b474-202d-41b7-adc1-a31989de8be2","Type":"ContainerStarted","Data":"553c8835bb757f1c2f4415b8c0023e5f909d2b88d24d72c287ceb10c0b49c4cf"} Oct 12 08:01:13 crc kubenswrapper[4599]: I1012 08:01:13.355843 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9ng9p" podStartSLOduration=2.878033614 podStartE2EDuration="5.355825618s" podCreationTimestamp="2025-10-12 08:01:08 +0000 UTC" firstStartedPulling="2025-10-12 08:01:10.293948729 +0000 UTC m=+1567.083144231" lastFinishedPulling="2025-10-12 08:01:12.771740732 +0000 UTC m=+1569.560936235" observedRunningTime="2025-10-12 08:01:13.352463245 +0000 UTC m=+1570.141658746" watchObservedRunningTime="2025-10-12 08:01:13.355825618 +0000 UTC m=+1570.145021120" Oct 12 08:01:13 crc kubenswrapper[4599]: I1012 08:01:13.553553 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73cd69f3-194c-4484-a0cf-01815024d884" path="/var/lib/kubelet/pods/73cd69f3-194c-4484-a0cf-01815024d884/volumes" Oct 12 08:01:15 crc kubenswrapper[4599]: I1012 08:01:15.027385 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-tszt8"] Oct 12 08:01:15 crc kubenswrapper[4599]: I1012 08:01:15.033408 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-tszt8"] Oct 12 08:01:15 crc kubenswrapper[4599]: I1012 08:01:15.553120 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8" path="/var/lib/kubelet/pods/a5ccaa75-c98c-4f83-af3c-0276bb6ad3a8/volumes" Oct 12 08:01:18 crc kubenswrapper[4599]: I1012 08:01:18.995613 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9ng9p" Oct 12 08:01:18 crc kubenswrapper[4599]: I1012 08:01:18.995923 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9ng9p" Oct 
12 08:01:19 crc kubenswrapper[4599]: I1012 08:01:19.029080 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9ng9p" Oct 12 08:01:19 crc kubenswrapper[4599]: I1012 08:01:19.408611 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9ng9p" Oct 12 08:01:19 crc kubenswrapper[4599]: I1012 08:01:19.444428 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9ng9p"] Oct 12 08:01:21 crc kubenswrapper[4599]: I1012 08:01:21.389007 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9ng9p" podUID="5cc9b474-202d-41b7-adc1-a31989de8be2" containerName="registry-server" containerID="cri-o://553c8835bb757f1c2f4415b8c0023e5f909d2b88d24d72c287ceb10c0b49c4cf" gracePeriod=2 Oct 12 08:01:21 crc kubenswrapper[4599]: I1012 08:01:21.774600 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9ng9p" Oct 12 08:01:21 crc kubenswrapper[4599]: I1012 08:01:21.791191 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfncl\" (UniqueName: \"kubernetes.io/projected/5cc9b474-202d-41b7-adc1-a31989de8be2-kube-api-access-mfncl\") pod \"5cc9b474-202d-41b7-adc1-a31989de8be2\" (UID: \"5cc9b474-202d-41b7-adc1-a31989de8be2\") " Oct 12 08:01:21 crc kubenswrapper[4599]: I1012 08:01:21.791358 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cc9b474-202d-41b7-adc1-a31989de8be2-utilities\") pod \"5cc9b474-202d-41b7-adc1-a31989de8be2\" (UID: \"5cc9b474-202d-41b7-adc1-a31989de8be2\") " Oct 12 08:01:21 crc kubenswrapper[4599]: I1012 08:01:21.791411 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cc9b474-202d-41b7-adc1-a31989de8be2-catalog-content\") pod \"5cc9b474-202d-41b7-adc1-a31989de8be2\" (UID: \"5cc9b474-202d-41b7-adc1-a31989de8be2\") " Oct 12 08:01:21 crc kubenswrapper[4599]: I1012 08:01:21.793403 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cc9b474-202d-41b7-adc1-a31989de8be2-utilities" (OuterVolumeSpecName: "utilities") pod "5cc9b474-202d-41b7-adc1-a31989de8be2" (UID: "5cc9b474-202d-41b7-adc1-a31989de8be2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:01:21 crc kubenswrapper[4599]: I1012 08:01:21.797265 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cc9b474-202d-41b7-adc1-a31989de8be2-kube-api-access-mfncl" (OuterVolumeSpecName: "kube-api-access-mfncl") pod "5cc9b474-202d-41b7-adc1-a31989de8be2" (UID: "5cc9b474-202d-41b7-adc1-a31989de8be2"). InnerVolumeSpecName "kube-api-access-mfncl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:01:21 crc kubenswrapper[4599]: I1012 08:01:21.831766 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cc9b474-202d-41b7-adc1-a31989de8be2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5cc9b474-202d-41b7-adc1-a31989de8be2" (UID: "5cc9b474-202d-41b7-adc1-a31989de8be2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:01:21 crc kubenswrapper[4599]: I1012 08:01:21.893571 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfncl\" (UniqueName: \"kubernetes.io/projected/5cc9b474-202d-41b7-adc1-a31989de8be2-kube-api-access-mfncl\") on node \"crc\" DevicePath \"\"" Oct 12 08:01:21 crc kubenswrapper[4599]: I1012 08:01:21.893609 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cc9b474-202d-41b7-adc1-a31989de8be2-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 08:01:21 crc kubenswrapper[4599]: I1012 08:01:21.893619 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cc9b474-202d-41b7-adc1-a31989de8be2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 08:01:22 crc kubenswrapper[4599]: I1012 08:01:22.398406 4599 generic.go:334] "Generic (PLEG): container finished" podID="5cc9b474-202d-41b7-adc1-a31989de8be2" containerID="553c8835bb757f1c2f4415b8c0023e5f909d2b88d24d72c287ceb10c0b49c4cf" exitCode=0 Oct 12 08:01:22 crc kubenswrapper[4599]: I1012 08:01:22.398443 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ng9p" event={"ID":"5cc9b474-202d-41b7-adc1-a31989de8be2","Type":"ContainerDied","Data":"553c8835bb757f1c2f4415b8c0023e5f909d2b88d24d72c287ceb10c0b49c4cf"} Oct 12 08:01:22 crc kubenswrapper[4599]: I1012 08:01:22.398467 4599 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9ng9p" Oct 12 08:01:22 crc kubenswrapper[4599]: I1012 08:01:22.398495 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ng9p" event={"ID":"5cc9b474-202d-41b7-adc1-a31989de8be2","Type":"ContainerDied","Data":"9a2545475d4d94e889e8123b3ff0bbdf0c348e28fdbe2a894fc18dfbdc5e5f97"} Oct 12 08:01:22 crc kubenswrapper[4599]: I1012 08:01:22.398521 4599 scope.go:117] "RemoveContainer" containerID="553c8835bb757f1c2f4415b8c0023e5f909d2b88d24d72c287ceb10c0b49c4cf" Oct 12 08:01:22 crc kubenswrapper[4599]: I1012 08:01:22.418086 4599 scope.go:117] "RemoveContainer" containerID="a36d4e0342b3133ae8d3c1bcbc1d304a82920cb244d1e4450a6c707fded6e8d6" Oct 12 08:01:22 crc kubenswrapper[4599]: I1012 08:01:22.424806 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9ng9p"] Oct 12 08:01:22 crc kubenswrapper[4599]: I1012 08:01:22.434101 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9ng9p"] Oct 12 08:01:22 crc kubenswrapper[4599]: I1012 08:01:22.444019 4599 scope.go:117] "RemoveContainer" containerID="4a58b9a3559a8f4370dd025183bebb9ed74d0e2466cd5036195c3d466a43bb11" Oct 12 08:01:22 crc kubenswrapper[4599]: I1012 08:01:22.473479 4599 scope.go:117] "RemoveContainer" containerID="553c8835bb757f1c2f4415b8c0023e5f909d2b88d24d72c287ceb10c0b49c4cf" Oct 12 08:01:22 crc kubenswrapper[4599]: E1012 08:01:22.474393 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"553c8835bb757f1c2f4415b8c0023e5f909d2b88d24d72c287ceb10c0b49c4cf\": container with ID starting with 553c8835bb757f1c2f4415b8c0023e5f909d2b88d24d72c287ceb10c0b49c4cf not found: ID does not exist" containerID="553c8835bb757f1c2f4415b8c0023e5f909d2b88d24d72c287ceb10c0b49c4cf" Oct 12 08:01:22 crc kubenswrapper[4599]: I1012 08:01:22.474426 
4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"553c8835bb757f1c2f4415b8c0023e5f909d2b88d24d72c287ceb10c0b49c4cf"} err="failed to get container status \"553c8835bb757f1c2f4415b8c0023e5f909d2b88d24d72c287ceb10c0b49c4cf\": rpc error: code = NotFound desc = could not find container \"553c8835bb757f1c2f4415b8c0023e5f909d2b88d24d72c287ceb10c0b49c4cf\": container with ID starting with 553c8835bb757f1c2f4415b8c0023e5f909d2b88d24d72c287ceb10c0b49c4cf not found: ID does not exist" Oct 12 08:01:22 crc kubenswrapper[4599]: I1012 08:01:22.474447 4599 scope.go:117] "RemoveContainer" containerID="a36d4e0342b3133ae8d3c1bcbc1d304a82920cb244d1e4450a6c707fded6e8d6" Oct 12 08:01:22 crc kubenswrapper[4599]: E1012 08:01:22.474928 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a36d4e0342b3133ae8d3c1bcbc1d304a82920cb244d1e4450a6c707fded6e8d6\": container with ID starting with a36d4e0342b3133ae8d3c1bcbc1d304a82920cb244d1e4450a6c707fded6e8d6 not found: ID does not exist" containerID="a36d4e0342b3133ae8d3c1bcbc1d304a82920cb244d1e4450a6c707fded6e8d6" Oct 12 08:01:22 crc kubenswrapper[4599]: I1012 08:01:22.474959 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a36d4e0342b3133ae8d3c1bcbc1d304a82920cb244d1e4450a6c707fded6e8d6"} err="failed to get container status \"a36d4e0342b3133ae8d3c1bcbc1d304a82920cb244d1e4450a6c707fded6e8d6\": rpc error: code = NotFound desc = could not find container \"a36d4e0342b3133ae8d3c1bcbc1d304a82920cb244d1e4450a6c707fded6e8d6\": container with ID starting with a36d4e0342b3133ae8d3c1bcbc1d304a82920cb244d1e4450a6c707fded6e8d6 not found: ID does not exist" Oct 12 08:01:22 crc kubenswrapper[4599]: I1012 08:01:22.474977 4599 scope.go:117] "RemoveContainer" containerID="4a58b9a3559a8f4370dd025183bebb9ed74d0e2466cd5036195c3d466a43bb11" Oct 12 08:01:22 crc kubenswrapper[4599]: E1012 
08:01:22.475144 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a58b9a3559a8f4370dd025183bebb9ed74d0e2466cd5036195c3d466a43bb11\": container with ID starting with 4a58b9a3559a8f4370dd025183bebb9ed74d0e2466cd5036195c3d466a43bb11 not found: ID does not exist" containerID="4a58b9a3559a8f4370dd025183bebb9ed74d0e2466cd5036195c3d466a43bb11" Oct 12 08:01:22 crc kubenswrapper[4599]: I1012 08:01:22.475173 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a58b9a3559a8f4370dd025183bebb9ed74d0e2466cd5036195c3d466a43bb11"} err="failed to get container status \"4a58b9a3559a8f4370dd025183bebb9ed74d0e2466cd5036195c3d466a43bb11\": rpc error: code = NotFound desc = could not find container \"4a58b9a3559a8f4370dd025183bebb9ed74d0e2466cd5036195c3d466a43bb11\": container with ID starting with 4a58b9a3559a8f4370dd025183bebb9ed74d0e2466cd5036195c3d466a43bb11 not found: ID does not exist" Oct 12 08:01:23 crc kubenswrapper[4599]: I1012 08:01:23.555614 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:01:23 crc kubenswrapper[4599]: E1012 08:01:23.556258 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:01:23 crc kubenswrapper[4599]: I1012 08:01:23.561938 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cc9b474-202d-41b7-adc1-a31989de8be2" path="/var/lib/kubelet/pods/5cc9b474-202d-41b7-adc1-a31989de8be2/volumes" Oct 12 08:01:35 crc kubenswrapper[4599]: I1012 08:01:35.545461 
4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:01:35 crc kubenswrapper[4599]: E1012 08:01:35.546279 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:01:42 crc kubenswrapper[4599]: I1012 08:01:42.571234 4599 generic.go:334] "Generic (PLEG): container finished" podID="9fd7ec79-0a36-4ac6-a81a-486df9b2ba89" containerID="099b53966882e44b42e16ba86396a58baa892ea90b9f9335618d2304a4be7fc4" exitCode=0 Oct 12 08:01:42 crc kubenswrapper[4599]: I1012 08:01:42.571380 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65" event={"ID":"9fd7ec79-0a36-4ac6-a81a-486df9b2ba89","Type":"ContainerDied","Data":"099b53966882e44b42e16ba86396a58baa892ea90b9f9335618d2304a4be7fc4"} Oct 12 08:01:43 crc kubenswrapper[4599]: I1012 08:01:43.915261 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.115537 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9fd7ec79-0a36-4ac6-a81a-486df9b2ba89-ssh-key\") pod \"9fd7ec79-0a36-4ac6-a81a-486df9b2ba89\" (UID: \"9fd7ec79-0a36-4ac6-a81a-486df9b2ba89\") " Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.115732 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fd7ec79-0a36-4ac6-a81a-486df9b2ba89-inventory\") pod \"9fd7ec79-0a36-4ac6-a81a-486df9b2ba89\" (UID: \"9fd7ec79-0a36-4ac6-a81a-486df9b2ba89\") " Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.115918 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6p2s\" (UniqueName: \"kubernetes.io/projected/9fd7ec79-0a36-4ac6-a81a-486df9b2ba89-kube-api-access-g6p2s\") pod \"9fd7ec79-0a36-4ac6-a81a-486df9b2ba89\" (UID: \"9fd7ec79-0a36-4ac6-a81a-486df9b2ba89\") " Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.122204 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fd7ec79-0a36-4ac6-a81a-486df9b2ba89-kube-api-access-g6p2s" (OuterVolumeSpecName: "kube-api-access-g6p2s") pod "9fd7ec79-0a36-4ac6-a81a-486df9b2ba89" (UID: "9fd7ec79-0a36-4ac6-a81a-486df9b2ba89"). InnerVolumeSpecName "kube-api-access-g6p2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.140132 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fd7ec79-0a36-4ac6-a81a-486df9b2ba89-inventory" (OuterVolumeSpecName: "inventory") pod "9fd7ec79-0a36-4ac6-a81a-486df9b2ba89" (UID: "9fd7ec79-0a36-4ac6-a81a-486df9b2ba89"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.140213 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fd7ec79-0a36-4ac6-a81a-486df9b2ba89-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9fd7ec79-0a36-4ac6-a81a-486df9b2ba89" (UID: "9fd7ec79-0a36-4ac6-a81a-486df9b2ba89"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.218832 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6p2s\" (UniqueName: \"kubernetes.io/projected/9fd7ec79-0a36-4ac6-a81a-486df9b2ba89-kube-api-access-g6p2s\") on node \"crc\" DevicePath \"\"" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.218869 4599 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9fd7ec79-0a36-4ac6-a81a-486df9b2ba89-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.218881 4599 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fd7ec79-0a36-4ac6-a81a-486df9b2ba89-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.587666 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65" event={"ID":"9fd7ec79-0a36-4ac6-a81a-486df9b2ba89","Type":"ContainerDied","Data":"191ffdb91f9d39e6c6f23c257a6ceb22ba2aba8e50b0be709ad4abf81753ae64"} Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.587710 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="191ffdb91f9d39e6c6f23c257a6ceb22ba2aba8e50b0be709ad4abf81753ae64" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.587712 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sck65" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.656535 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-kbv4c"] Oct 12 08:01:44 crc kubenswrapper[4599]: E1012 08:01:44.657059 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd7ec79-0a36-4ac6-a81a-486df9b2ba89" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.657084 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd7ec79-0a36-4ac6-a81a-486df9b2ba89" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 08:01:44 crc kubenswrapper[4599]: E1012 08:01:44.657110 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cc9b474-202d-41b7-adc1-a31989de8be2" containerName="extract-utilities" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.657117 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cc9b474-202d-41b7-adc1-a31989de8be2" containerName="extract-utilities" Oct 12 08:01:44 crc kubenswrapper[4599]: E1012 08:01:44.657130 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cc9b474-202d-41b7-adc1-a31989de8be2" containerName="registry-server" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.657136 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cc9b474-202d-41b7-adc1-a31989de8be2" containerName="registry-server" Oct 12 08:01:44 crc kubenswrapper[4599]: E1012 08:01:44.657157 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cc9b474-202d-41b7-adc1-a31989de8be2" containerName="extract-content" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.657163 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cc9b474-202d-41b7-adc1-a31989de8be2" containerName="extract-content" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.657394 4599 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9fd7ec79-0a36-4ac6-a81a-486df9b2ba89" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.657427 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cc9b474-202d-41b7-adc1-a31989de8be2" containerName="registry-server" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.658202 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-kbv4c" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.660182 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.660740 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.660791 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x82pw" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.660999 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.672302 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-kbv4c"] Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.729654 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/88ad049a-ede5-4ac3-842b-c1ab9199014a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-kbv4c\" (UID: \"88ad049a-ede5-4ac3-842b-c1ab9199014a\") " pod="openstack/ssh-known-hosts-edpm-deployment-kbv4c" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.729736 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6vll\" (UniqueName: \"kubernetes.io/projected/88ad049a-ede5-4ac3-842b-c1ab9199014a-kube-api-access-q6vll\") pod \"ssh-known-hosts-edpm-deployment-kbv4c\" (UID: \"88ad049a-ede5-4ac3-842b-c1ab9199014a\") " pod="openstack/ssh-known-hosts-edpm-deployment-kbv4c" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.729778 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/88ad049a-ede5-4ac3-842b-c1ab9199014a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-kbv4c\" (UID: \"88ad049a-ede5-4ac3-842b-c1ab9199014a\") " pod="openstack/ssh-known-hosts-edpm-deployment-kbv4c" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.832136 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/88ad049a-ede5-4ac3-842b-c1ab9199014a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-kbv4c\" (UID: \"88ad049a-ede5-4ac3-842b-c1ab9199014a\") " pod="openstack/ssh-known-hosts-edpm-deployment-kbv4c" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.832209 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6vll\" (UniqueName: \"kubernetes.io/projected/88ad049a-ede5-4ac3-842b-c1ab9199014a-kube-api-access-q6vll\") pod \"ssh-known-hosts-edpm-deployment-kbv4c\" (UID: \"88ad049a-ede5-4ac3-842b-c1ab9199014a\") " pod="openstack/ssh-known-hosts-edpm-deployment-kbv4c" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.832244 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/88ad049a-ede5-4ac3-842b-c1ab9199014a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-kbv4c\" (UID: \"88ad049a-ede5-4ac3-842b-c1ab9199014a\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-kbv4c" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.835883 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/88ad049a-ede5-4ac3-842b-c1ab9199014a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-kbv4c\" (UID: \"88ad049a-ede5-4ac3-842b-c1ab9199014a\") " pod="openstack/ssh-known-hosts-edpm-deployment-kbv4c" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.836441 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/88ad049a-ede5-4ac3-842b-c1ab9199014a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-kbv4c\" (UID: \"88ad049a-ede5-4ac3-842b-c1ab9199014a\") " pod="openstack/ssh-known-hosts-edpm-deployment-kbv4c" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.846607 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6vll\" (UniqueName: \"kubernetes.io/projected/88ad049a-ede5-4ac3-842b-c1ab9199014a-kube-api-access-q6vll\") pod \"ssh-known-hosts-edpm-deployment-kbv4c\" (UID: \"88ad049a-ede5-4ac3-842b-c1ab9199014a\") " pod="openstack/ssh-known-hosts-edpm-deployment-kbv4c" Oct 12 08:01:44 crc kubenswrapper[4599]: I1012 08:01:44.978156 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-kbv4c" Oct 12 08:01:45 crc kubenswrapper[4599]: I1012 08:01:45.466305 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-kbv4c"] Oct 12 08:01:45 crc kubenswrapper[4599]: I1012 08:01:45.600831 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-kbv4c" event={"ID":"88ad049a-ede5-4ac3-842b-c1ab9199014a","Type":"ContainerStarted","Data":"4fc30d262290109df08a1b4c19ccf6ff626afa6ff4266c21bf6913e914044666"} Oct 12 08:01:46 crc kubenswrapper[4599]: I1012 08:01:46.612117 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-kbv4c" event={"ID":"88ad049a-ede5-4ac3-842b-c1ab9199014a","Type":"ContainerStarted","Data":"3e99feebd8a520b5e0578f0bafcc24b4f75a202c7baf4903d74ec892337f22c5"} Oct 12 08:01:46 crc kubenswrapper[4599]: I1012 08:01:46.632171 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-kbv4c" podStartSLOduration=2.103645154 podStartE2EDuration="2.632152947s" podCreationTimestamp="2025-10-12 08:01:44 +0000 UTC" firstStartedPulling="2025-10-12 08:01:45.462498094 +0000 UTC m=+1602.251693596" lastFinishedPulling="2025-10-12 08:01:45.991005886 +0000 UTC m=+1602.780201389" observedRunningTime="2025-10-12 08:01:46.627431961 +0000 UTC m=+1603.416627463" watchObservedRunningTime="2025-10-12 08:01:46.632152947 +0000 UTC m=+1603.421348450" Oct 12 08:01:47 crc kubenswrapper[4599]: I1012 08:01:47.545246 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:01:47 crc kubenswrapper[4599]: E1012 08:01:47.546138 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:01:51 crc kubenswrapper[4599]: I1012 08:01:51.655138 4599 generic.go:334] "Generic (PLEG): container finished" podID="88ad049a-ede5-4ac3-842b-c1ab9199014a" containerID="3e99feebd8a520b5e0578f0bafcc24b4f75a202c7baf4903d74ec892337f22c5" exitCode=0 Oct 12 08:01:51 crc kubenswrapper[4599]: I1012 08:01:51.655225 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-kbv4c" event={"ID":"88ad049a-ede5-4ac3-842b-c1ab9199014a","Type":"ContainerDied","Data":"3e99feebd8a520b5e0578f0bafcc24b4f75a202c7baf4903d74ec892337f22c5"} Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.025393 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-kbv4c" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.114711 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/88ad049a-ede5-4ac3-842b-c1ab9199014a-ssh-key-openstack-edpm-ipam\") pod \"88ad049a-ede5-4ac3-842b-c1ab9199014a\" (UID: \"88ad049a-ede5-4ac3-842b-c1ab9199014a\") " Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.114797 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/88ad049a-ede5-4ac3-842b-c1ab9199014a-inventory-0\") pod \"88ad049a-ede5-4ac3-842b-c1ab9199014a\" (UID: \"88ad049a-ede5-4ac3-842b-c1ab9199014a\") " Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.114819 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6vll\" (UniqueName: \"kubernetes.io/projected/88ad049a-ede5-4ac3-842b-c1ab9199014a-kube-api-access-q6vll\") pod 
\"88ad049a-ede5-4ac3-842b-c1ab9199014a\" (UID: \"88ad049a-ede5-4ac3-842b-c1ab9199014a\") " Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.123428 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88ad049a-ede5-4ac3-842b-c1ab9199014a-kube-api-access-q6vll" (OuterVolumeSpecName: "kube-api-access-q6vll") pod "88ad049a-ede5-4ac3-842b-c1ab9199014a" (UID: "88ad049a-ede5-4ac3-842b-c1ab9199014a"). InnerVolumeSpecName "kube-api-access-q6vll". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.138967 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ad049a-ede5-4ac3-842b-c1ab9199014a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "88ad049a-ede5-4ac3-842b-c1ab9199014a" (UID: "88ad049a-ede5-4ac3-842b-c1ab9199014a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.139228 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ad049a-ede5-4ac3-842b-c1ab9199014a-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "88ad049a-ede5-4ac3-842b-c1ab9199014a" (UID: "88ad049a-ede5-4ac3-842b-c1ab9199014a"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.217177 4599 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/88ad049a-ede5-4ac3-842b-c1ab9199014a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.217213 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6vll\" (UniqueName: \"kubernetes.io/projected/88ad049a-ede5-4ac3-842b-c1ab9199014a-kube-api-access-q6vll\") on node \"crc\" DevicePath \"\"" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.217224 4599 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/88ad049a-ede5-4ac3-842b-c1ab9199014a-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.678901 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-kbv4c" event={"ID":"88ad049a-ede5-4ac3-842b-c1ab9199014a","Type":"ContainerDied","Data":"4fc30d262290109df08a1b4c19ccf6ff626afa6ff4266c21bf6913e914044666"} Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.678964 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fc30d262290109df08a1b4c19ccf6ff626afa6ff4266c21bf6913e914044666" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.679074 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-kbv4c" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.741872 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc"] Oct 12 08:01:53 crc kubenswrapper[4599]: E1012 08:01:53.742752 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ad049a-ede5-4ac3-842b-c1ab9199014a" containerName="ssh-known-hosts-edpm-deployment" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.742777 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ad049a-ede5-4ac3-842b-c1ab9199014a" containerName="ssh-known-hosts-edpm-deployment" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.743016 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="88ad049a-ede5-4ac3-842b-c1ab9199014a" containerName="ssh-known-hosts-edpm-deployment" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.743909 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.746131 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.746248 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.746410 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.746452 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x82pw" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.749258 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc"] Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.827877 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53901e82-60d6-4dd0-9ec9-15851ecb4215-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ssdqc\" (UID: \"53901e82-60d6-4dd0-9ec9-15851ecb4215\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.827942 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krs89\" (UniqueName: \"kubernetes.io/projected/53901e82-60d6-4dd0-9ec9-15851ecb4215-kube-api-access-krs89\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ssdqc\" (UID: \"53901e82-60d6-4dd0-9ec9-15851ecb4215\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.827980 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53901e82-60d6-4dd0-9ec9-15851ecb4215-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ssdqc\" (UID: \"53901e82-60d6-4dd0-9ec9-15851ecb4215\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.930014 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53901e82-60d6-4dd0-9ec9-15851ecb4215-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ssdqc\" (UID: \"53901e82-60d6-4dd0-9ec9-15851ecb4215\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.930073 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krs89\" (UniqueName: \"kubernetes.io/projected/53901e82-60d6-4dd0-9ec9-15851ecb4215-kube-api-access-krs89\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ssdqc\" (UID: \"53901e82-60d6-4dd0-9ec9-15851ecb4215\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.930106 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53901e82-60d6-4dd0-9ec9-15851ecb4215-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ssdqc\" (UID: \"53901e82-60d6-4dd0-9ec9-15851ecb4215\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.936280 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53901e82-60d6-4dd0-9ec9-15851ecb4215-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ssdqc\" (UID: \"53901e82-60d6-4dd0-9ec9-15851ecb4215\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.936302 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53901e82-60d6-4dd0-9ec9-15851ecb4215-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ssdqc\" (UID: \"53901e82-60d6-4dd0-9ec9-15851ecb4215\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc" Oct 12 08:01:53 crc kubenswrapper[4599]: I1012 08:01:53.945509 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krs89\" (UniqueName: \"kubernetes.io/projected/53901e82-60d6-4dd0-9ec9-15851ecb4215-kube-api-access-krs89\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ssdqc\" (UID: \"53901e82-60d6-4dd0-9ec9-15851ecb4215\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc" Oct 12 08:01:54 crc kubenswrapper[4599]: I1012 08:01:54.060543 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc" Oct 12 08:01:54 crc kubenswrapper[4599]: I1012 08:01:54.521905 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc"] Oct 12 08:01:54 crc kubenswrapper[4599]: I1012 08:01:54.689598 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc" event={"ID":"53901e82-60d6-4dd0-9ec9-15851ecb4215","Type":"ContainerStarted","Data":"244c87a6afd51637700bc68b491e793d324b79f23f7a9578b23868bd947e2bd1"} Oct 12 08:01:55 crc kubenswrapper[4599]: I1012 08:01:55.701180 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc" event={"ID":"53901e82-60d6-4dd0-9ec9-15851ecb4215","Type":"ContainerStarted","Data":"041576f4dee23930ea589522ee19ae93dc612fa7a747c781defca8e3bc26b5b1"} Oct 12 08:01:55 crc kubenswrapper[4599]: I1012 08:01:55.719093 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc" podStartSLOduration=2.182243512 podStartE2EDuration="2.719072473s" podCreationTimestamp="2025-10-12 08:01:53 +0000 UTC" firstStartedPulling="2025-10-12 08:01:54.524721597 +0000 UTC m=+1611.313917100" lastFinishedPulling="2025-10-12 08:01:55.061550559 +0000 UTC m=+1611.850746061" observedRunningTime="2025-10-12 08:01:55.718470176 +0000 UTC m=+1612.507665679" watchObservedRunningTime="2025-10-12 08:01:55.719072473 +0000 UTC m=+1612.508267975" Oct 12 08:01:58 crc kubenswrapper[4599]: I1012 08:01:58.547007 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:01:58 crc kubenswrapper[4599]: E1012 08:01:58.547836 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:01:59 crc kubenswrapper[4599]: I1012 08:01:59.046271 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-mz8gv"] Oct 12 08:01:59 crc kubenswrapper[4599]: I1012 08:01:59.055244 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-mz8gv"] Oct 12 08:01:59 crc kubenswrapper[4599]: I1012 08:01:59.556844 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7331ab7-59b5-4d49-9314-e59cbb00f156" path="/var/lib/kubelet/pods/e7331ab7-59b5-4d49-9314-e59cbb00f156/volumes" Oct 12 08:02:01 crc kubenswrapper[4599]: I1012 08:02:01.760114 4599 generic.go:334] "Generic (PLEG): container finished" podID="53901e82-60d6-4dd0-9ec9-15851ecb4215" containerID="041576f4dee23930ea589522ee19ae93dc612fa7a747c781defca8e3bc26b5b1" exitCode=0 Oct 12 08:02:01 crc kubenswrapper[4599]: I1012 08:02:01.760217 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc" event={"ID":"53901e82-60d6-4dd0-9ec9-15851ecb4215","Type":"ContainerDied","Data":"041576f4dee23930ea589522ee19ae93dc612fa7a747c781defca8e3bc26b5b1"} Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.083926 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc" Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.229198 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53901e82-60d6-4dd0-9ec9-15851ecb4215-inventory\") pod \"53901e82-60d6-4dd0-9ec9-15851ecb4215\" (UID: \"53901e82-60d6-4dd0-9ec9-15851ecb4215\") " Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.229274 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53901e82-60d6-4dd0-9ec9-15851ecb4215-ssh-key\") pod \"53901e82-60d6-4dd0-9ec9-15851ecb4215\" (UID: \"53901e82-60d6-4dd0-9ec9-15851ecb4215\") " Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.229313 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krs89\" (UniqueName: \"kubernetes.io/projected/53901e82-60d6-4dd0-9ec9-15851ecb4215-kube-api-access-krs89\") pod \"53901e82-60d6-4dd0-9ec9-15851ecb4215\" (UID: \"53901e82-60d6-4dd0-9ec9-15851ecb4215\") " Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.236221 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53901e82-60d6-4dd0-9ec9-15851ecb4215-kube-api-access-krs89" (OuterVolumeSpecName: "kube-api-access-krs89") pod "53901e82-60d6-4dd0-9ec9-15851ecb4215" (UID: "53901e82-60d6-4dd0-9ec9-15851ecb4215"). InnerVolumeSpecName "kube-api-access-krs89". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.253853 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53901e82-60d6-4dd0-9ec9-15851ecb4215-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "53901e82-60d6-4dd0-9ec9-15851ecb4215" (UID: "53901e82-60d6-4dd0-9ec9-15851ecb4215"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.256526 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53901e82-60d6-4dd0-9ec9-15851ecb4215-inventory" (OuterVolumeSpecName: "inventory") pod "53901e82-60d6-4dd0-9ec9-15851ecb4215" (UID: "53901e82-60d6-4dd0-9ec9-15851ecb4215"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.331807 4599 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53901e82-60d6-4dd0-9ec9-15851ecb4215-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.331898 4599 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53901e82-60d6-4dd0-9ec9-15851ecb4215-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.331955 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krs89\" (UniqueName: \"kubernetes.io/projected/53901e82-60d6-4dd0-9ec9-15851ecb4215-kube-api-access-krs89\") on node \"crc\" DevicePath \"\"" Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.785822 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc" event={"ID":"53901e82-60d6-4dd0-9ec9-15851ecb4215","Type":"ContainerDied","Data":"244c87a6afd51637700bc68b491e793d324b79f23f7a9578b23868bd947e2bd1"} Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.786147 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="244c87a6afd51637700bc68b491e793d324b79f23f7a9578b23868bd947e2bd1" Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.785931 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ssdqc" Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.841367 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95"] Oct 12 08:02:03 crc kubenswrapper[4599]: E1012 08:02:03.841909 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53901e82-60d6-4dd0-9ec9-15851ecb4215" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.841928 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="53901e82-60d6-4dd0-9ec9-15851ecb4215" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.842249 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="53901e82-60d6-4dd0-9ec9-15851ecb4215" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.843034 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95" Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.845444 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x82pw" Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.845531 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.846808 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.850991 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.851921 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95"] Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.948419 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e741b043-6773-4cbb-88fc-d3dc8cd7d39d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95\" (UID: \"e741b043-6773-4cbb-88fc-d3dc8cd7d39d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95" Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.948479 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e741b043-6773-4cbb-88fc-d3dc8cd7d39d-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95\" (UID: \"e741b043-6773-4cbb-88fc-d3dc8cd7d39d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95" Oct 12 08:02:03 crc kubenswrapper[4599]: I1012 08:02:03.948743 4599 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59kbv\" (UniqueName: \"kubernetes.io/projected/e741b043-6773-4cbb-88fc-d3dc8cd7d39d-kube-api-access-59kbv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95\" (UID: \"e741b043-6773-4cbb-88fc-d3dc8cd7d39d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95" Oct 12 08:02:04 crc kubenswrapper[4599]: I1012 08:02:04.052150 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e741b043-6773-4cbb-88fc-d3dc8cd7d39d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95\" (UID: \"e741b043-6773-4cbb-88fc-d3dc8cd7d39d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95" Oct 12 08:02:04 crc kubenswrapper[4599]: I1012 08:02:04.052205 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e741b043-6773-4cbb-88fc-d3dc8cd7d39d-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95\" (UID: \"e741b043-6773-4cbb-88fc-d3dc8cd7d39d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95" Oct 12 08:02:04 crc kubenswrapper[4599]: I1012 08:02:04.052281 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59kbv\" (UniqueName: \"kubernetes.io/projected/e741b043-6773-4cbb-88fc-d3dc8cd7d39d-kube-api-access-59kbv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95\" (UID: \"e741b043-6773-4cbb-88fc-d3dc8cd7d39d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95" Oct 12 08:02:04 crc kubenswrapper[4599]: I1012 08:02:04.058751 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e741b043-6773-4cbb-88fc-d3dc8cd7d39d-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95\" (UID: 
\"e741b043-6773-4cbb-88fc-d3dc8cd7d39d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95" Oct 12 08:02:04 crc kubenswrapper[4599]: I1012 08:02:04.060260 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e741b043-6773-4cbb-88fc-d3dc8cd7d39d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95\" (UID: \"e741b043-6773-4cbb-88fc-d3dc8cd7d39d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95" Oct 12 08:02:04 crc kubenswrapper[4599]: I1012 08:02:04.068304 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59kbv\" (UniqueName: \"kubernetes.io/projected/e741b043-6773-4cbb-88fc-d3dc8cd7d39d-kube-api-access-59kbv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95\" (UID: \"e741b043-6773-4cbb-88fc-d3dc8cd7d39d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95" Oct 12 08:02:04 crc kubenswrapper[4599]: I1012 08:02:04.175480 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95" Oct 12 08:02:04 crc kubenswrapper[4599]: I1012 08:02:04.631941 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95"] Oct 12 08:02:04 crc kubenswrapper[4599]: I1012 08:02:04.796595 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95" event={"ID":"e741b043-6773-4cbb-88fc-d3dc8cd7d39d","Type":"ContainerStarted","Data":"16946e7e59ceb17fb49838dd4f59f3e84b94761476c7e4509055b478a6f70728"} Oct 12 08:02:05 crc kubenswrapper[4599]: I1012 08:02:05.810019 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95" event={"ID":"e741b043-6773-4cbb-88fc-d3dc8cd7d39d","Type":"ContainerStarted","Data":"9811007120ab1f1fd41e78ff7a1dcea3eaac1f4c7ffcc2dba032e35aa0409c0b"} Oct 12 08:02:05 crc kubenswrapper[4599]: I1012 08:02:05.826480 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95" podStartSLOduration=2.115871486 podStartE2EDuration="2.826451223s" podCreationTimestamp="2025-10-12 08:02:03 +0000 UTC" firstStartedPulling="2025-10-12 08:02:04.639914483 +0000 UTC m=+1621.429109985" lastFinishedPulling="2025-10-12 08:02:05.35049422 +0000 UTC m=+1622.139689722" observedRunningTime="2025-10-12 08:02:05.824074452 +0000 UTC m=+1622.613269943" watchObservedRunningTime="2025-10-12 08:02:05.826451223 +0000 UTC m=+1622.615646725" Oct 12 08:02:07 crc kubenswrapper[4599]: I1012 08:02:07.304996 4599 scope.go:117] "RemoveContainer" containerID="4d592c4300dd77d3ce4d85000060919b3b15a7d14a77df4f4fd43b2a04a20080" Oct 12 08:02:07 crc kubenswrapper[4599]: I1012 08:02:07.350937 4599 scope.go:117] "RemoveContainer" containerID="e36713052874f7760fa2c79f35758ba1305bdd4f4f880c287b39882eb5b5c250" Oct 12 08:02:07 crc 
kubenswrapper[4599]: I1012 08:02:07.389179 4599 scope.go:117] "RemoveContainer" containerID="9251b6efd4054f03bf78b785b242d671aede9a26bb2415e8b96330e3b4e6d0de" Oct 12 08:02:11 crc kubenswrapper[4599]: I1012 08:02:11.546174 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:02:11 crc kubenswrapper[4599]: E1012 08:02:11.547117 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:02:12 crc kubenswrapper[4599]: I1012 08:02:12.868986 4599 generic.go:334] "Generic (PLEG): container finished" podID="e741b043-6773-4cbb-88fc-d3dc8cd7d39d" containerID="9811007120ab1f1fd41e78ff7a1dcea3eaac1f4c7ffcc2dba032e35aa0409c0b" exitCode=0 Oct 12 08:02:12 crc kubenswrapper[4599]: I1012 08:02:12.869086 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95" event={"ID":"e741b043-6773-4cbb-88fc-d3dc8cd7d39d","Type":"ContainerDied","Data":"9811007120ab1f1fd41e78ff7a1dcea3eaac1f4c7ffcc2dba032e35aa0409c0b"} Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.194786 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.282240 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59kbv\" (UniqueName: \"kubernetes.io/projected/e741b043-6773-4cbb-88fc-d3dc8cd7d39d-kube-api-access-59kbv\") pod \"e741b043-6773-4cbb-88fc-d3dc8cd7d39d\" (UID: \"e741b043-6773-4cbb-88fc-d3dc8cd7d39d\") " Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.282331 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e741b043-6773-4cbb-88fc-d3dc8cd7d39d-inventory\") pod \"e741b043-6773-4cbb-88fc-d3dc8cd7d39d\" (UID: \"e741b043-6773-4cbb-88fc-d3dc8cd7d39d\") " Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.288460 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e741b043-6773-4cbb-88fc-d3dc8cd7d39d-kube-api-access-59kbv" (OuterVolumeSpecName: "kube-api-access-59kbv") pod "e741b043-6773-4cbb-88fc-d3dc8cd7d39d" (UID: "e741b043-6773-4cbb-88fc-d3dc8cd7d39d"). InnerVolumeSpecName "kube-api-access-59kbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.306425 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e741b043-6773-4cbb-88fc-d3dc8cd7d39d-inventory" (OuterVolumeSpecName: "inventory") pod "e741b043-6773-4cbb-88fc-d3dc8cd7d39d" (UID: "e741b043-6773-4cbb-88fc-d3dc8cd7d39d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.383764 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e741b043-6773-4cbb-88fc-d3dc8cd7d39d-ssh-key\") pod \"e741b043-6773-4cbb-88fc-d3dc8cd7d39d\" (UID: \"e741b043-6773-4cbb-88fc-d3dc8cd7d39d\") " Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.384317 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59kbv\" (UniqueName: \"kubernetes.io/projected/e741b043-6773-4cbb-88fc-d3dc8cd7d39d-kube-api-access-59kbv\") on node \"crc\" DevicePath \"\"" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.384352 4599 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e741b043-6773-4cbb-88fc-d3dc8cd7d39d-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.403550 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e741b043-6773-4cbb-88fc-d3dc8cd7d39d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e741b043-6773-4cbb-88fc-d3dc8cd7d39d" (UID: "e741b043-6773-4cbb-88fc-d3dc8cd7d39d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.486560 4599 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e741b043-6773-4cbb-88fc-d3dc8cd7d39d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.887218 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95" event={"ID":"e741b043-6773-4cbb-88fc-d3dc8cd7d39d","Type":"ContainerDied","Data":"16946e7e59ceb17fb49838dd4f59f3e84b94761476c7e4509055b478a6f70728"} Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.887589 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16946e7e59ceb17fb49838dd4f59f3e84b94761476c7e4509055b478a6f70728" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.887288 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.955616 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4"] Oct 12 08:02:14 crc kubenswrapper[4599]: E1012 08:02:14.956203 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e741b043-6773-4cbb-88fc-d3dc8cd7d39d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.956228 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="e741b043-6773-4cbb-88fc-d3dc8cd7d39d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.956474 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="e741b043-6773-4cbb-88fc-d3dc8cd7d39d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 
08:02:14.957322 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.959686 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.960282 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.960307 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.960363 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.960373 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.960647 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.960762 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x82pw" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.962610 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.962844 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4"] Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.996788 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.996850 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.996925 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.996992 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.997138 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.997202 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.997240 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.997298 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.997398 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.997445 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.997476 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.997502 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qsjm\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-kube-api-access-7qsjm\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.997610 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:14 crc kubenswrapper[4599]: I1012 08:02:14.997770 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.099397 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.099445 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.099479 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.099512 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.099552 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.099732 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.099786 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.100452 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.100505 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.100569 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.100602 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.100628 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.100650 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qsjm\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-kube-api-access-7qsjm\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.100691 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.103916 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.105265 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.105515 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.105617 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.105627 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.105777 4599 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.105808 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.105936 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.106863 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.107230 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.107923 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.108012 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.109673 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.118247 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qsjm\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-kube-api-access-7qsjm\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.273457 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.722949 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4"] Oct 12 08:02:15 crc kubenswrapper[4599]: I1012 08:02:15.898296 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" event={"ID":"bc8b881e-7904-44fe-ae99-975ece57dc4c","Type":"ContainerStarted","Data":"c419cfa31fdfd362eb5edc49bd5fbf6e3b0d3a87503a3b8c073c1bb9ee54f231"} Oct 12 08:02:16 crc kubenswrapper[4599]: I1012 08:02:16.906257 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" event={"ID":"bc8b881e-7904-44fe-ae99-975ece57dc4c","Type":"ContainerStarted","Data":"149f1258a60e2ec0043e4a881f0cac111fa73ca8a6d10f75ffda3ad11c2f62fa"} Oct 12 08:02:16 crc kubenswrapper[4599]: I1012 08:02:16.928151 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" podStartSLOduration=2.325604348 podStartE2EDuration="2.928135483s" podCreationTimestamp="2025-10-12 08:02:14 +0000 UTC" firstStartedPulling="2025-10-12 08:02:15.727035157 +0000 UTC m=+1632.516230659" lastFinishedPulling="2025-10-12 08:02:16.329566301 +0000 UTC m=+1633.118761794" observedRunningTime="2025-10-12 08:02:16.92512685 +0000 UTC m=+1633.714322362" watchObservedRunningTime="2025-10-12 08:02:16.928135483 +0000 UTC m=+1633.717330985" Oct 12 08:02:22 crc kubenswrapper[4599]: I1012 08:02:22.544980 4599 
scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:02:22 crc kubenswrapper[4599]: E1012 08:02:22.545717 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:02:34 crc kubenswrapper[4599]: I1012 08:02:34.545954 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:02:34 crc kubenswrapper[4599]: E1012 08:02:34.547088 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:02:43 crc kubenswrapper[4599]: I1012 08:02:43.098780 4599 generic.go:334] "Generic (PLEG): container finished" podID="bc8b881e-7904-44fe-ae99-975ece57dc4c" containerID="149f1258a60e2ec0043e4a881f0cac111fa73ca8a6d10f75ffda3ad11c2f62fa" exitCode=0 Oct 12 08:02:43 crc kubenswrapper[4599]: I1012 08:02:43.098904 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" event={"ID":"bc8b881e-7904-44fe-ae99-975ece57dc4c","Type":"ContainerDied","Data":"149f1258a60e2ec0043e4a881f0cac111fa73ca8a6d10f75ffda3ad11c2f62fa"} Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.409465 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.547628 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qsjm\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-kube-api-access-7qsjm\") pod \"bc8b881e-7904-44fe-ae99-975ece57dc4c\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.547679 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"bc8b881e-7904-44fe-ae99-975ece57dc4c\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.547785 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-nova-combined-ca-bundle\") pod \"bc8b881e-7904-44fe-ae99-975ece57dc4c\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.548002 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-ovn-combined-ca-bundle\") pod \"bc8b881e-7904-44fe-ae99-975ece57dc4c\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.548038 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-inventory\") pod \"bc8b881e-7904-44fe-ae99-975ece57dc4c\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " Oct 12 08:02:44 crc kubenswrapper[4599]: 
I1012 08:02:44.548056 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"bc8b881e-7904-44fe-ae99-975ece57dc4c\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.548110 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-bootstrap-combined-ca-bundle\") pod \"bc8b881e-7904-44fe-ae99-975ece57dc4c\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.548203 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-telemetry-combined-ca-bundle\") pod \"bc8b881e-7904-44fe-ae99-975ece57dc4c\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.548247 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-neutron-metadata-combined-ca-bundle\") pod \"bc8b881e-7904-44fe-ae99-975ece57dc4c\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.548298 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-repo-setup-combined-ca-bundle\") pod \"bc8b881e-7904-44fe-ae99-975ece57dc4c\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.548321 4599 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-ssh-key\") pod \"bc8b881e-7904-44fe-ae99-975ece57dc4c\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.548365 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"bc8b881e-7904-44fe-ae99-975ece57dc4c\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.548383 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-libvirt-combined-ca-bundle\") pod \"bc8b881e-7904-44fe-ae99-975ece57dc4c\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.548400 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"bc8b881e-7904-44fe-ae99-975ece57dc4c\" (UID: \"bc8b881e-7904-44fe-ae99-975ece57dc4c\") " Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.553502 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "bc8b881e-7904-44fe-ae99-975ece57dc4c" (UID: "bc8b881e-7904-44fe-ae99-975ece57dc4c"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.553833 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "bc8b881e-7904-44fe-ae99-975ece57dc4c" (UID: "bc8b881e-7904-44fe-ae99-975ece57dc4c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.554258 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "bc8b881e-7904-44fe-ae99-975ece57dc4c" (UID: "bc8b881e-7904-44fe-ae99-975ece57dc4c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.554980 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "bc8b881e-7904-44fe-ae99-975ece57dc4c" (UID: "bc8b881e-7904-44fe-ae99-975ece57dc4c"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.555199 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "bc8b881e-7904-44fe-ae99-975ece57dc4c" (UID: "bc8b881e-7904-44fe-ae99-975ece57dc4c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.555199 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "bc8b881e-7904-44fe-ae99-975ece57dc4c" (UID: "bc8b881e-7904-44fe-ae99-975ece57dc4c"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.555240 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "bc8b881e-7904-44fe-ae99-975ece57dc4c" (UID: "bc8b881e-7904-44fe-ae99-975ece57dc4c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.555584 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "bc8b881e-7904-44fe-ae99-975ece57dc4c" (UID: "bc8b881e-7904-44fe-ae99-975ece57dc4c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.555602 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "bc8b881e-7904-44fe-ae99-975ece57dc4c" (UID: "bc8b881e-7904-44fe-ae99-975ece57dc4c"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.555953 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-kube-api-access-7qsjm" (OuterVolumeSpecName: "kube-api-access-7qsjm") pod "bc8b881e-7904-44fe-ae99-975ece57dc4c" (UID: "bc8b881e-7904-44fe-ae99-975ece57dc4c"). InnerVolumeSpecName "kube-api-access-7qsjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.556273 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "bc8b881e-7904-44fe-ae99-975ece57dc4c" (UID: "bc8b881e-7904-44fe-ae99-975ece57dc4c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.556903 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "bc8b881e-7904-44fe-ae99-975ece57dc4c" (UID: "bc8b881e-7904-44fe-ae99-975ece57dc4c"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.573492 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bc8b881e-7904-44fe-ae99-975ece57dc4c" (UID: "bc8b881e-7904-44fe-ae99-975ece57dc4c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.575070 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-inventory" (OuterVolumeSpecName: "inventory") pod "bc8b881e-7904-44fe-ae99-975ece57dc4c" (UID: "bc8b881e-7904-44fe-ae99-975ece57dc4c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.651819 4599 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.651849 4599 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.651859 4599 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.651870 4599 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.651881 4599 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.651891 4599 
reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.651900 4599 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.651908 4599 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.651921 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qsjm\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-kube-api-access-7qsjm\") on node \"crc\" DevicePath \"\"" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.651930 4599 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.651940 4599 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.651952 4599 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.651962 4599 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc8b881e-7904-44fe-ae99-975ece57dc4c-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 08:02:44 crc kubenswrapper[4599]: I1012 08:02:44.651972 4599 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8b881e-7904-44fe-ae99-975ece57dc4c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.127975 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" event={"ID":"bc8b881e-7904-44fe-ae99-975ece57dc4c","Type":"ContainerDied","Data":"c419cfa31fdfd362eb5edc49bd5fbf6e3b0d3a87503a3b8c073c1bb9ee54f231"} Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.128029 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c419cfa31fdfd362eb5edc49bd5fbf6e3b0d3a87503a3b8c073c1bb9ee54f231" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.128058 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.186691 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn"] Oct 12 08:02:45 crc kubenswrapper[4599]: E1012 08:02:45.187032 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc8b881e-7904-44fe-ae99-975ece57dc4c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.187050 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc8b881e-7904-44fe-ae99-975ece57dc4c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.187221 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc8b881e-7904-44fe-ae99-975ece57dc4c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.187855 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.189364 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.189722 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.189876 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x82pw" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.189985 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.190096 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.196358 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn"] Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.260731 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nmnvn\" (UID: \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.260864 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nmnvn\" (UID: \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.260939 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nmnvn\" (UID: \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.261061 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nmnvn\" (UID: \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.261112 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7q57\" (UniqueName: \"kubernetes.io/projected/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-kube-api-access-n7q57\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nmnvn\" (UID: \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.363373 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nmnvn\" (UID: \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.363476 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nmnvn\" (UID: \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.363553 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nmnvn\" (UID: \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.363657 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nmnvn\" (UID: \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.363719 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7q57\" (UniqueName: \"kubernetes.io/projected/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-kube-api-access-n7q57\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nmnvn\" (UID: \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.364428 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nmnvn\" (UID: \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" Oct 12 08:02:45 crc 
kubenswrapper[4599]: I1012 08:02:45.367110 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nmnvn\" (UID: \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.367827 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nmnvn\" (UID: \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.368571 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nmnvn\" (UID: \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.378084 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7q57\" (UniqueName: \"kubernetes.io/projected/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-kube-api-access-n7q57\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nmnvn\" (UID: \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.502274 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" Oct 12 08:02:45 crc kubenswrapper[4599]: I1012 08:02:45.939117 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn"] Oct 12 08:02:46 crc kubenswrapper[4599]: I1012 08:02:46.136055 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" event={"ID":"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd","Type":"ContainerStarted","Data":"1f2140a7cb0846a7c00af9470c1acb8efc36de411f8b90ab124c6c2d0dc96431"} Oct 12 08:02:47 crc kubenswrapper[4599]: I1012 08:02:47.150208 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" event={"ID":"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd","Type":"ContainerStarted","Data":"49ae2a26d16513fe1c75bd864fdc290d997e007fd64b61767699a6820bdb850e"} Oct 12 08:02:47 crc kubenswrapper[4599]: I1012 08:02:47.164183 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" podStartSLOduration=1.571214251 podStartE2EDuration="2.164166961s" podCreationTimestamp="2025-10-12 08:02:45 +0000 UTC" firstStartedPulling="2025-10-12 08:02:45.947487604 +0000 UTC m=+1662.736683107" lastFinishedPulling="2025-10-12 08:02:46.540440315 +0000 UTC m=+1663.329635817" observedRunningTime="2025-10-12 08:02:47.16197619 +0000 UTC m=+1663.951171692" watchObservedRunningTime="2025-10-12 08:02:47.164166961 +0000 UTC m=+1663.953362463" Oct 12 08:02:48 crc kubenswrapper[4599]: I1012 08:02:48.545318 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:02:48 crc kubenswrapper[4599]: E1012 08:02:48.545939 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:03:00 crc kubenswrapper[4599]: I1012 08:03:00.545887 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:03:00 crc kubenswrapper[4599]: E1012 08:03:00.546590 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:03:05 crc kubenswrapper[4599]: I1012 08:03:05.368121 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5v7pk"] Oct 12 08:03:05 crc kubenswrapper[4599]: I1012 08:03:05.370306 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5v7pk" Oct 12 08:03:05 crc kubenswrapper[4599]: I1012 08:03:05.381261 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5v7pk"] Oct 12 08:03:05 crc kubenswrapper[4599]: I1012 08:03:05.469656 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzlvc\" (UniqueName: \"kubernetes.io/projected/f9083f4a-18a2-4442-8541-ffa19550e0fc-kube-api-access-wzlvc\") pod \"redhat-marketplace-5v7pk\" (UID: \"f9083f4a-18a2-4442-8541-ffa19550e0fc\") " pod="openshift-marketplace/redhat-marketplace-5v7pk" Oct 12 08:03:05 crc kubenswrapper[4599]: I1012 08:03:05.469809 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9083f4a-18a2-4442-8541-ffa19550e0fc-catalog-content\") pod \"redhat-marketplace-5v7pk\" (UID: \"f9083f4a-18a2-4442-8541-ffa19550e0fc\") " pod="openshift-marketplace/redhat-marketplace-5v7pk" Oct 12 08:03:05 crc kubenswrapper[4599]: I1012 08:03:05.469915 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9083f4a-18a2-4442-8541-ffa19550e0fc-utilities\") pod \"redhat-marketplace-5v7pk\" (UID: \"f9083f4a-18a2-4442-8541-ffa19550e0fc\") " pod="openshift-marketplace/redhat-marketplace-5v7pk" Oct 12 08:03:05 crc kubenswrapper[4599]: I1012 08:03:05.571533 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzlvc\" (UniqueName: \"kubernetes.io/projected/f9083f4a-18a2-4442-8541-ffa19550e0fc-kube-api-access-wzlvc\") pod \"redhat-marketplace-5v7pk\" (UID: \"f9083f4a-18a2-4442-8541-ffa19550e0fc\") " pod="openshift-marketplace/redhat-marketplace-5v7pk" Oct 12 08:03:05 crc kubenswrapper[4599]: I1012 08:03:05.571613 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9083f4a-18a2-4442-8541-ffa19550e0fc-catalog-content\") pod \"redhat-marketplace-5v7pk\" (UID: \"f9083f4a-18a2-4442-8541-ffa19550e0fc\") " pod="openshift-marketplace/redhat-marketplace-5v7pk" Oct 12 08:03:05 crc kubenswrapper[4599]: I1012 08:03:05.571680 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9083f4a-18a2-4442-8541-ffa19550e0fc-utilities\") pod \"redhat-marketplace-5v7pk\" (UID: \"f9083f4a-18a2-4442-8541-ffa19550e0fc\") " pod="openshift-marketplace/redhat-marketplace-5v7pk" Oct 12 08:03:05 crc kubenswrapper[4599]: I1012 08:03:05.572045 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9083f4a-18a2-4442-8541-ffa19550e0fc-catalog-content\") pod \"redhat-marketplace-5v7pk\" (UID: \"f9083f4a-18a2-4442-8541-ffa19550e0fc\") " pod="openshift-marketplace/redhat-marketplace-5v7pk" Oct 12 08:03:05 crc kubenswrapper[4599]: I1012 08:03:05.572161 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9083f4a-18a2-4442-8541-ffa19550e0fc-utilities\") pod \"redhat-marketplace-5v7pk\" (UID: \"f9083f4a-18a2-4442-8541-ffa19550e0fc\") " pod="openshift-marketplace/redhat-marketplace-5v7pk" Oct 12 08:03:05 crc kubenswrapper[4599]: I1012 08:03:05.587771 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzlvc\" (UniqueName: \"kubernetes.io/projected/f9083f4a-18a2-4442-8541-ffa19550e0fc-kube-api-access-wzlvc\") pod \"redhat-marketplace-5v7pk\" (UID: \"f9083f4a-18a2-4442-8541-ffa19550e0fc\") " pod="openshift-marketplace/redhat-marketplace-5v7pk" Oct 12 08:03:05 crc kubenswrapper[4599]: I1012 08:03:05.686153 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5v7pk" Oct 12 08:03:06 crc kubenswrapper[4599]: I1012 08:03:06.063669 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5v7pk"] Oct 12 08:03:06 crc kubenswrapper[4599]: I1012 08:03:06.292624 4599 generic.go:334] "Generic (PLEG): container finished" podID="f9083f4a-18a2-4442-8541-ffa19550e0fc" containerID="b3afdd8c0dbc1f3dc8b777137e115a4838b81aff7d50a1a563ce25cf11233bae" exitCode=0 Oct 12 08:03:06 crc kubenswrapper[4599]: I1012 08:03:06.292668 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v7pk" event={"ID":"f9083f4a-18a2-4442-8541-ffa19550e0fc","Type":"ContainerDied","Data":"b3afdd8c0dbc1f3dc8b777137e115a4838b81aff7d50a1a563ce25cf11233bae"} Oct 12 08:03:06 crc kubenswrapper[4599]: I1012 08:03:06.292695 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v7pk" event={"ID":"f9083f4a-18a2-4442-8541-ffa19550e0fc","Type":"ContainerStarted","Data":"b28b2a772125ecad0755ec7d997ddd7704d51b4c6942334a4eb8f84c7502d0ab"} Oct 12 08:03:07 crc kubenswrapper[4599]: I1012 08:03:07.318382 4599 generic.go:334] "Generic (PLEG): container finished" podID="f9083f4a-18a2-4442-8541-ffa19550e0fc" containerID="cc3c9b53781b0e010863e73d071ad3e4f10bf780eb74b36b08e8726d5af43f87" exitCode=0 Oct 12 08:03:07 crc kubenswrapper[4599]: I1012 08:03:07.318676 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v7pk" event={"ID":"f9083f4a-18a2-4442-8541-ffa19550e0fc","Type":"ContainerDied","Data":"cc3c9b53781b0e010863e73d071ad3e4f10bf780eb74b36b08e8726d5af43f87"} Oct 12 08:03:08 crc kubenswrapper[4599]: I1012 08:03:08.326586 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v7pk" 
event={"ID":"f9083f4a-18a2-4442-8541-ffa19550e0fc","Type":"ContainerStarted","Data":"e7004fcaec750ade4a4f58190810c62dbba4c40c675b98a172f5133f9c45c586"} Oct 12 08:03:08 crc kubenswrapper[4599]: I1012 08:03:08.344605 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5v7pk" podStartSLOduration=1.861134413 podStartE2EDuration="3.344587383s" podCreationTimestamp="2025-10-12 08:03:05 +0000 UTC" firstStartedPulling="2025-10-12 08:03:06.294086304 +0000 UTC m=+1683.083281806" lastFinishedPulling="2025-10-12 08:03:07.777539274 +0000 UTC m=+1684.566734776" observedRunningTime="2025-10-12 08:03:08.338589682 +0000 UTC m=+1685.127785184" watchObservedRunningTime="2025-10-12 08:03:08.344587383 +0000 UTC m=+1685.133782885" Oct 12 08:03:11 crc kubenswrapper[4599]: I1012 08:03:11.545829 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:03:11 crc kubenswrapper[4599]: E1012 08:03:11.546247 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:03:15 crc kubenswrapper[4599]: I1012 08:03:15.686999 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5v7pk" Oct 12 08:03:15 crc kubenswrapper[4599]: I1012 08:03:15.687592 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5v7pk" Oct 12 08:03:15 crc kubenswrapper[4599]: I1012 08:03:15.721649 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-5v7pk" Oct 12 08:03:16 crc kubenswrapper[4599]: I1012 08:03:16.406746 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5v7pk" Oct 12 08:03:16 crc kubenswrapper[4599]: I1012 08:03:16.446729 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5v7pk"] Oct 12 08:03:18 crc kubenswrapper[4599]: I1012 08:03:18.388451 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5v7pk" podUID="f9083f4a-18a2-4442-8541-ffa19550e0fc" containerName="registry-server" containerID="cri-o://e7004fcaec750ade4a4f58190810c62dbba4c40c675b98a172f5133f9c45c586" gracePeriod=2 Oct 12 08:03:18 crc kubenswrapper[4599]: I1012 08:03:18.725773 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5v7pk" Oct 12 08:03:18 crc kubenswrapper[4599]: I1012 08:03:18.893274 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9083f4a-18a2-4442-8541-ffa19550e0fc-catalog-content\") pod \"f9083f4a-18a2-4442-8541-ffa19550e0fc\" (UID: \"f9083f4a-18a2-4442-8541-ffa19550e0fc\") " Oct 12 08:03:18 crc kubenswrapper[4599]: I1012 08:03:18.893381 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzlvc\" (UniqueName: \"kubernetes.io/projected/f9083f4a-18a2-4442-8541-ffa19550e0fc-kube-api-access-wzlvc\") pod \"f9083f4a-18a2-4442-8541-ffa19550e0fc\" (UID: \"f9083f4a-18a2-4442-8541-ffa19550e0fc\") " Oct 12 08:03:18 crc kubenswrapper[4599]: I1012 08:03:18.893417 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9083f4a-18a2-4442-8541-ffa19550e0fc-utilities\") pod 
\"f9083f4a-18a2-4442-8541-ffa19550e0fc\" (UID: \"f9083f4a-18a2-4442-8541-ffa19550e0fc\") " Oct 12 08:03:18 crc kubenswrapper[4599]: I1012 08:03:18.894153 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9083f4a-18a2-4442-8541-ffa19550e0fc-utilities" (OuterVolumeSpecName: "utilities") pod "f9083f4a-18a2-4442-8541-ffa19550e0fc" (UID: "f9083f4a-18a2-4442-8541-ffa19550e0fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:03:18 crc kubenswrapper[4599]: I1012 08:03:18.898279 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9083f4a-18a2-4442-8541-ffa19550e0fc-kube-api-access-wzlvc" (OuterVolumeSpecName: "kube-api-access-wzlvc") pod "f9083f4a-18a2-4442-8541-ffa19550e0fc" (UID: "f9083f4a-18a2-4442-8541-ffa19550e0fc"). InnerVolumeSpecName "kube-api-access-wzlvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:03:18 crc kubenswrapper[4599]: I1012 08:03:18.926164 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9083f4a-18a2-4442-8541-ffa19550e0fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9083f4a-18a2-4442-8541-ffa19550e0fc" (UID: "f9083f4a-18a2-4442-8541-ffa19550e0fc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:03:18 crc kubenswrapper[4599]: I1012 08:03:18.995234 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9083f4a-18a2-4442-8541-ffa19550e0fc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 08:03:18 crc kubenswrapper[4599]: I1012 08:03:18.995276 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzlvc\" (UniqueName: \"kubernetes.io/projected/f9083f4a-18a2-4442-8541-ffa19550e0fc-kube-api-access-wzlvc\") on node \"crc\" DevicePath \"\"" Oct 12 08:03:18 crc kubenswrapper[4599]: I1012 08:03:18.995290 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9083f4a-18a2-4442-8541-ffa19550e0fc-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 08:03:19 crc kubenswrapper[4599]: I1012 08:03:19.397103 4599 generic.go:334] "Generic (PLEG): container finished" podID="f9083f4a-18a2-4442-8541-ffa19550e0fc" containerID="e7004fcaec750ade4a4f58190810c62dbba4c40c675b98a172f5133f9c45c586" exitCode=0 Oct 12 08:03:19 crc kubenswrapper[4599]: I1012 08:03:19.397160 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v7pk" event={"ID":"f9083f4a-18a2-4442-8541-ffa19550e0fc","Type":"ContainerDied","Data":"e7004fcaec750ade4a4f58190810c62dbba4c40c675b98a172f5133f9c45c586"} Oct 12 08:03:19 crc kubenswrapper[4599]: I1012 08:03:19.397213 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v7pk" event={"ID":"f9083f4a-18a2-4442-8541-ffa19550e0fc","Type":"ContainerDied","Data":"b28b2a772125ecad0755ec7d997ddd7704d51b4c6942334a4eb8f84c7502d0ab"} Oct 12 08:03:19 crc kubenswrapper[4599]: I1012 08:03:19.397236 4599 scope.go:117] "RemoveContainer" containerID="e7004fcaec750ade4a4f58190810c62dbba4c40c675b98a172f5133f9c45c586" Oct 12 08:03:19 crc kubenswrapper[4599]: I1012 
08:03:19.397169 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5v7pk" Oct 12 08:03:19 crc kubenswrapper[4599]: I1012 08:03:19.414120 4599 scope.go:117] "RemoveContainer" containerID="cc3c9b53781b0e010863e73d071ad3e4f10bf780eb74b36b08e8726d5af43f87" Oct 12 08:03:19 crc kubenswrapper[4599]: I1012 08:03:19.421203 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5v7pk"] Oct 12 08:03:19 crc kubenswrapper[4599]: I1012 08:03:19.427651 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5v7pk"] Oct 12 08:03:19 crc kubenswrapper[4599]: I1012 08:03:19.449599 4599 scope.go:117] "RemoveContainer" containerID="b3afdd8c0dbc1f3dc8b777137e115a4838b81aff7d50a1a563ce25cf11233bae" Oct 12 08:03:19 crc kubenswrapper[4599]: I1012 08:03:19.468049 4599 scope.go:117] "RemoveContainer" containerID="e7004fcaec750ade4a4f58190810c62dbba4c40c675b98a172f5133f9c45c586" Oct 12 08:03:19 crc kubenswrapper[4599]: E1012 08:03:19.468433 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7004fcaec750ade4a4f58190810c62dbba4c40c675b98a172f5133f9c45c586\": container with ID starting with e7004fcaec750ade4a4f58190810c62dbba4c40c675b98a172f5133f9c45c586 not found: ID does not exist" containerID="e7004fcaec750ade4a4f58190810c62dbba4c40c675b98a172f5133f9c45c586" Oct 12 08:03:19 crc kubenswrapper[4599]: I1012 08:03:19.468475 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7004fcaec750ade4a4f58190810c62dbba4c40c675b98a172f5133f9c45c586"} err="failed to get container status \"e7004fcaec750ade4a4f58190810c62dbba4c40c675b98a172f5133f9c45c586\": rpc error: code = NotFound desc = could not find container \"e7004fcaec750ade4a4f58190810c62dbba4c40c675b98a172f5133f9c45c586\": container with ID starting with 
e7004fcaec750ade4a4f58190810c62dbba4c40c675b98a172f5133f9c45c586 not found: ID does not exist" Oct 12 08:03:19 crc kubenswrapper[4599]: I1012 08:03:19.468500 4599 scope.go:117] "RemoveContainer" containerID="cc3c9b53781b0e010863e73d071ad3e4f10bf780eb74b36b08e8726d5af43f87" Oct 12 08:03:19 crc kubenswrapper[4599]: E1012 08:03:19.468748 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc3c9b53781b0e010863e73d071ad3e4f10bf780eb74b36b08e8726d5af43f87\": container with ID starting with cc3c9b53781b0e010863e73d071ad3e4f10bf780eb74b36b08e8726d5af43f87 not found: ID does not exist" containerID="cc3c9b53781b0e010863e73d071ad3e4f10bf780eb74b36b08e8726d5af43f87" Oct 12 08:03:19 crc kubenswrapper[4599]: I1012 08:03:19.468775 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc3c9b53781b0e010863e73d071ad3e4f10bf780eb74b36b08e8726d5af43f87"} err="failed to get container status \"cc3c9b53781b0e010863e73d071ad3e4f10bf780eb74b36b08e8726d5af43f87\": rpc error: code = NotFound desc = could not find container \"cc3c9b53781b0e010863e73d071ad3e4f10bf780eb74b36b08e8726d5af43f87\": container with ID starting with cc3c9b53781b0e010863e73d071ad3e4f10bf780eb74b36b08e8726d5af43f87 not found: ID does not exist" Oct 12 08:03:19 crc kubenswrapper[4599]: I1012 08:03:19.468790 4599 scope.go:117] "RemoveContainer" containerID="b3afdd8c0dbc1f3dc8b777137e115a4838b81aff7d50a1a563ce25cf11233bae" Oct 12 08:03:19 crc kubenswrapper[4599]: E1012 08:03:19.469006 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3afdd8c0dbc1f3dc8b777137e115a4838b81aff7d50a1a563ce25cf11233bae\": container with ID starting with b3afdd8c0dbc1f3dc8b777137e115a4838b81aff7d50a1a563ce25cf11233bae not found: ID does not exist" containerID="b3afdd8c0dbc1f3dc8b777137e115a4838b81aff7d50a1a563ce25cf11233bae" Oct 12 08:03:19 crc 
kubenswrapper[4599]: I1012 08:03:19.469025 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3afdd8c0dbc1f3dc8b777137e115a4838b81aff7d50a1a563ce25cf11233bae"} err="failed to get container status \"b3afdd8c0dbc1f3dc8b777137e115a4838b81aff7d50a1a563ce25cf11233bae\": rpc error: code = NotFound desc = could not find container \"b3afdd8c0dbc1f3dc8b777137e115a4838b81aff7d50a1a563ce25cf11233bae\": container with ID starting with b3afdd8c0dbc1f3dc8b777137e115a4838b81aff7d50a1a563ce25cf11233bae not found: ID does not exist" Oct 12 08:03:19 crc kubenswrapper[4599]: I1012 08:03:19.552932 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9083f4a-18a2-4442-8541-ffa19550e0fc" path="/var/lib/kubelet/pods/f9083f4a-18a2-4442-8541-ffa19550e0fc/volumes" Oct 12 08:03:25 crc kubenswrapper[4599]: I1012 08:03:25.545970 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:03:25 crc kubenswrapper[4599]: E1012 08:03:25.546897 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:03:30 crc kubenswrapper[4599]: I1012 08:03:30.465683 4599 generic.go:334] "Generic (PLEG): container finished" podID="29a6cb6f-6a2a-405e-a24b-5d49ed9288cd" containerID="49ae2a26d16513fe1c75bd864fdc290d997e007fd64b61767699a6820bdb850e" exitCode=0 Oct 12 08:03:30 crc kubenswrapper[4599]: I1012 08:03:30.465772 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" 
event={"ID":"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd","Type":"ContainerDied","Data":"49ae2a26d16513fe1c75bd864fdc290d997e007fd64b61767699a6820bdb850e"} Oct 12 08:03:31 crc kubenswrapper[4599]: I1012 08:03:31.760592 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" Oct 12 08:03:31 crc kubenswrapper[4599]: I1012 08:03:31.886598 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-inventory\") pod \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\" (UID: \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\") " Oct 12 08:03:31 crc kubenswrapper[4599]: I1012 08:03:31.886662 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7q57\" (UniqueName: \"kubernetes.io/projected/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-kube-api-access-n7q57\") pod \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\" (UID: \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\") " Oct 12 08:03:31 crc kubenswrapper[4599]: I1012 08:03:31.886696 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-ovn-combined-ca-bundle\") pod \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\" (UID: \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\") " Oct 12 08:03:31 crc kubenswrapper[4599]: I1012 08:03:31.886728 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-ovncontroller-config-0\") pod \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\" (UID: \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\") " Oct 12 08:03:31 crc kubenswrapper[4599]: I1012 08:03:31.886774 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-ssh-key\") pod \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\" (UID: \"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd\") " Oct 12 08:03:31 crc kubenswrapper[4599]: I1012 08:03:31.891202 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-kube-api-access-n7q57" (OuterVolumeSpecName: "kube-api-access-n7q57") pod "29a6cb6f-6a2a-405e-a24b-5d49ed9288cd" (UID: "29a6cb6f-6a2a-405e-a24b-5d49ed9288cd"). InnerVolumeSpecName "kube-api-access-n7q57". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:03:31 crc kubenswrapper[4599]: I1012 08:03:31.891440 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "29a6cb6f-6a2a-405e-a24b-5d49ed9288cd" (UID: "29a6cb6f-6a2a-405e-a24b-5d49ed9288cd"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:03:31 crc kubenswrapper[4599]: I1012 08:03:31.904872 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "29a6cb6f-6a2a-405e-a24b-5d49ed9288cd" (UID: "29a6cb6f-6a2a-405e-a24b-5d49ed9288cd"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 08:03:31 crc kubenswrapper[4599]: I1012 08:03:31.907299 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-inventory" (OuterVolumeSpecName: "inventory") pod "29a6cb6f-6a2a-405e-a24b-5d49ed9288cd" (UID: "29a6cb6f-6a2a-405e-a24b-5d49ed9288cd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:03:31 crc kubenswrapper[4599]: I1012 08:03:31.907692 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "29a6cb6f-6a2a-405e-a24b-5d49ed9288cd" (UID: "29a6cb6f-6a2a-405e-a24b-5d49ed9288cd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:03:31 crc kubenswrapper[4599]: I1012 08:03:31.988898 4599 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 08:03:31 crc kubenswrapper[4599]: I1012 08:03:31.988928 4599 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 12 08:03:31 crc kubenswrapper[4599]: I1012 08:03:31.988937 4599 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 08:03:31 crc kubenswrapper[4599]: I1012 08:03:31.988947 4599 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 08:03:31 crc kubenswrapper[4599]: I1012 08:03:31.988957 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7q57\" (UniqueName: \"kubernetes.io/projected/29a6cb6f-6a2a-405e-a24b-5d49ed9288cd-kube-api-access-n7q57\") on node \"crc\" DevicePath \"\"" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.478801 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" event={"ID":"29a6cb6f-6a2a-405e-a24b-5d49ed9288cd","Type":"ContainerDied","Data":"1f2140a7cb0846a7c00af9470c1acb8efc36de411f8b90ab124c6c2d0dc96431"} Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.479043 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f2140a7cb0846a7c00af9470c1acb8efc36de411f8b90ab124c6c2d0dc96431" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.478868 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nmnvn" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.543367 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25"] Oct 12 08:03:32 crc kubenswrapper[4599]: E1012 08:03:32.543669 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9083f4a-18a2-4442-8541-ffa19550e0fc" containerName="registry-server" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.543687 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9083f4a-18a2-4442-8541-ffa19550e0fc" containerName="registry-server" Oct 12 08:03:32 crc kubenswrapper[4599]: E1012 08:03:32.543718 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a6cb6f-6a2a-405e-a24b-5d49ed9288cd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.543725 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a6cb6f-6a2a-405e-a24b-5d49ed9288cd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 12 08:03:32 crc kubenswrapper[4599]: E1012 08:03:32.543747 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9083f4a-18a2-4442-8541-ffa19550e0fc" containerName="extract-content" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.543752 4599 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f9083f4a-18a2-4442-8541-ffa19550e0fc" containerName="extract-content" Oct 12 08:03:32 crc kubenswrapper[4599]: E1012 08:03:32.543770 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9083f4a-18a2-4442-8541-ffa19550e0fc" containerName="extract-utilities" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.543776 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9083f4a-18a2-4442-8541-ffa19550e0fc" containerName="extract-utilities" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.543925 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9083f4a-18a2-4442-8541-ffa19550e0fc" containerName="registry-server" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.543959 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a6cb6f-6a2a-405e-a24b-5d49ed9288cd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.544480 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.547524 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.547634 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x82pw" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.547923 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.548210 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.548416 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.548488 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.556000 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25"] Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.699128 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.699369 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.699568 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.699635 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vszxn\" (UniqueName: \"kubernetes.io/projected/d5a50516-f480-4da5-adb8-853dd9ce7b6c-kube-api-access-vszxn\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.699683 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.699908 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.801645 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.801765 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.801792 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.801833 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.801855 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vszxn\" (UniqueName: \"kubernetes.io/projected/d5a50516-f480-4da5-adb8-853dd9ce7b6c-kube-api-access-vszxn\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.801877 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.805191 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.805732 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25\" (UID: 
\"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.805815 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.805816 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.806169 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 08:03:32.815861 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vszxn\" (UniqueName: \"kubernetes.io/projected/d5a50516-f480-4da5-adb8-853dd9ce7b6c-kube-api-access-vszxn\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:03:32 crc kubenswrapper[4599]: I1012 
08:03:32.860756 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:03:33 crc kubenswrapper[4599]: I1012 08:03:33.289418 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25"] Oct 12 08:03:33 crc kubenswrapper[4599]: I1012 08:03:33.485304 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" event={"ID":"d5a50516-f480-4da5-adb8-853dd9ce7b6c","Type":"ContainerStarted","Data":"1e5f54018ef9418282eeccd3776afb09bcf0faf36128b13483ea24885ca61347"} Oct 12 08:03:34 crc kubenswrapper[4599]: I1012 08:03:34.493094 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" event={"ID":"d5a50516-f480-4da5-adb8-853dd9ce7b6c","Type":"ContainerStarted","Data":"e3ec1584c4d78935612dba3110c4224915c609ea25c9fcf81e6a1085dd120a04"} Oct 12 08:03:34 crc kubenswrapper[4599]: I1012 08:03:34.511142 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" podStartSLOduration=1.926431584 podStartE2EDuration="2.511128656s" podCreationTimestamp="2025-10-12 08:03:32 +0000 UTC" firstStartedPulling="2025-10-12 08:03:33.296886747 +0000 UTC m=+1710.086082249" lastFinishedPulling="2025-10-12 08:03:33.881583819 +0000 UTC m=+1710.670779321" observedRunningTime="2025-10-12 08:03:34.505234431 +0000 UTC m=+1711.294429933" watchObservedRunningTime="2025-10-12 08:03:34.511128656 +0000 UTC m=+1711.300324148" Oct 12 08:03:38 crc kubenswrapper[4599]: I1012 08:03:38.545060 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:03:38 crc kubenswrapper[4599]: E1012 08:03:38.545688 4599 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:03:50 crc kubenswrapper[4599]: I1012 08:03:50.545540 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:03:50 crc kubenswrapper[4599]: E1012 08:03:50.546466 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:04:04 crc kubenswrapper[4599]: I1012 08:04:04.545257 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:04:04 crc kubenswrapper[4599]: E1012 08:04:04.545897 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:04:07 crc kubenswrapper[4599]: I1012 08:04:07.706313 4599 generic.go:334] "Generic (PLEG): container finished" podID="d5a50516-f480-4da5-adb8-853dd9ce7b6c" containerID="e3ec1584c4d78935612dba3110c4224915c609ea25c9fcf81e6a1085dd120a04" exitCode=0 Oct 12 08:04:07 
crc kubenswrapper[4599]: I1012 08:04:07.706368 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" event={"ID":"d5a50516-f480-4da5-adb8-853dd9ce7b6c","Type":"ContainerDied","Data":"e3ec1584c4d78935612dba3110c4224915c609ea25c9fcf81e6a1085dd120a04"} Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.029570 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.138965 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-inventory\") pod \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.139309 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.139376 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-neutron-metadata-combined-ca-bundle\") pod \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.139436 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vszxn\" (UniqueName: \"kubernetes.io/projected/d5a50516-f480-4da5-adb8-853dd9ce7b6c-kube-api-access-vszxn\") pod 
\"d5a50516-f480-4da5-adb8-853dd9ce7b6c\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.139518 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-nova-metadata-neutron-config-0\") pod \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.139563 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-ssh-key\") pod \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\" (UID: \"d5a50516-f480-4da5-adb8-853dd9ce7b6c\") " Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.144214 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d5a50516-f480-4da5-adb8-853dd9ce7b6c" (UID: "d5a50516-f480-4da5-adb8-853dd9ce7b6c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.146897 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5a50516-f480-4da5-adb8-853dd9ce7b6c-kube-api-access-vszxn" (OuterVolumeSpecName: "kube-api-access-vszxn") pod "d5a50516-f480-4da5-adb8-853dd9ce7b6c" (UID: "d5a50516-f480-4da5-adb8-853dd9ce7b6c"). InnerVolumeSpecName "kube-api-access-vszxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.160704 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "d5a50516-f480-4da5-adb8-853dd9ce7b6c" (UID: "d5a50516-f480-4da5-adb8-853dd9ce7b6c"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.161149 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "d5a50516-f480-4da5-adb8-853dd9ce7b6c" (UID: "d5a50516-f480-4da5-adb8-853dd9ce7b6c"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.161641 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-inventory" (OuterVolumeSpecName: "inventory") pod "d5a50516-f480-4da5-adb8-853dd9ce7b6c" (UID: "d5a50516-f480-4da5-adb8-853dd9ce7b6c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.162057 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d5a50516-f480-4da5-adb8-853dd9ce7b6c" (UID: "d5a50516-f480-4da5-adb8-853dd9ce7b6c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.241099 4599 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.241126 4599 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.241140 4599 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.241152 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vszxn\" (UniqueName: \"kubernetes.io/projected/d5a50516-f480-4da5-adb8-853dd9ce7b6c-kube-api-access-vszxn\") on node \"crc\" DevicePath \"\"" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.241161 4599 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.241169 4599 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5a50516-f480-4da5-adb8-853dd9ce7b6c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.719701 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" 
event={"ID":"d5a50516-f480-4da5-adb8-853dd9ce7b6c","Type":"ContainerDied","Data":"1e5f54018ef9418282eeccd3776afb09bcf0faf36128b13483ea24885ca61347"} Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.719745 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.719747 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e5f54018ef9418282eeccd3776afb09bcf0faf36128b13483ea24885ca61347" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.779831 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh"] Oct 12 08:04:09 crc kubenswrapper[4599]: E1012 08:04:09.780159 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a50516-f480-4da5-adb8-853dd9ce7b6c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.780181 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a50516-f480-4da5-adb8-853dd9ce7b6c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.780403 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a50516-f480-4da5-adb8-853dd9ce7b6c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.780966 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.783671 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.783696 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.783761 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.783771 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x82pw" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.785385 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.790363 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh"] Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.853136 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh\" (UID: \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.853706 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh\" (UID: 
\"ed0b094a-51d6-4287-b4a4-4a0934139fa2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.853830 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md9vm\" (UniqueName: \"kubernetes.io/projected/ed0b094a-51d6-4287-b4a4-4a0934139fa2-kube-api-access-md9vm\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh\" (UID: \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.853881 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh\" (UID: \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.853997 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh\" (UID: \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.955208 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh\" (UID: \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.955266 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-md9vm\" (UniqueName: \"kubernetes.io/projected/ed0b094a-51d6-4287-b4a4-4a0934139fa2-kube-api-access-md9vm\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh\" (UID: \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.955305 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh\" (UID: \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.955429 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh\" (UID: \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.955696 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh\" (UID: \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.958869 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh\" (UID: \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.958902 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh\" (UID: \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.959145 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh\" (UID: \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.960082 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh\" (UID: \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" Oct 12 08:04:09 crc kubenswrapper[4599]: I1012 08:04:09.968601 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md9vm\" (UniqueName: \"kubernetes.io/projected/ed0b094a-51d6-4287-b4a4-4a0934139fa2-kube-api-access-md9vm\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh\" (UID: \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" Oct 12 08:04:10 crc kubenswrapper[4599]: I1012 08:04:10.095681 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" Oct 12 08:04:10 crc kubenswrapper[4599]: I1012 08:04:10.532008 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh"] Oct 12 08:04:10 crc kubenswrapper[4599]: I1012 08:04:10.535938 4599 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 08:04:10 crc kubenswrapper[4599]: I1012 08:04:10.726497 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" event={"ID":"ed0b094a-51d6-4287-b4a4-4a0934139fa2","Type":"ContainerStarted","Data":"8e653027c634f783bfe09e170e5cfc90de167a50c0bbcbed14c3c251a8427885"} Oct 12 08:04:11 crc kubenswrapper[4599]: I1012 08:04:11.734314 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" event={"ID":"ed0b094a-51d6-4287-b4a4-4a0934139fa2","Type":"ContainerStarted","Data":"9af4ac3e4f9f54fbfceff37aaa53689791ff1f83701a0867de57d275370b066d"} Oct 12 08:04:16 crc kubenswrapper[4599]: I1012 08:04:16.545567 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:04:16 crc kubenswrapper[4599]: E1012 08:04:16.546188 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:04:31 crc kubenswrapper[4599]: I1012 08:04:31.545817 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:04:31 crc 
kubenswrapper[4599]: I1012 08:04:31.867976 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerStarted","Data":"27e98f9f32a7679119f49b8ab56e856d08b70cad719b2a459bd215581f62e802"} Oct 12 08:04:31 crc kubenswrapper[4599]: I1012 08:04:31.884308 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" podStartSLOduration=22.059196558 podStartE2EDuration="22.88427878s" podCreationTimestamp="2025-10-12 08:04:09 +0000 UTC" firstStartedPulling="2025-10-12 08:04:10.535716807 +0000 UTC m=+1747.324912308" lastFinishedPulling="2025-10-12 08:04:11.360799028 +0000 UTC m=+1748.149994530" observedRunningTime="2025-10-12 08:04:11.753627405 +0000 UTC m=+1748.542822906" watchObservedRunningTime="2025-10-12 08:04:31.88427878 +0000 UTC m=+1768.673474282" Oct 12 08:06:58 crc kubenswrapper[4599]: I1012 08:06:58.321924 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 08:06:58 crc kubenswrapper[4599]: I1012 08:06:58.322606 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 08:07:03 crc kubenswrapper[4599]: I1012 08:07:03.886592 4599 generic.go:334] "Generic (PLEG): container finished" podID="ed0b094a-51d6-4287-b4a4-4a0934139fa2" containerID="9af4ac3e4f9f54fbfceff37aaa53689791ff1f83701a0867de57d275370b066d" exitCode=0 Oct 12 08:07:03 crc 
kubenswrapper[4599]: I1012 08:07:03.886664 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" event={"ID":"ed0b094a-51d6-4287-b4a4-4a0934139fa2","Type":"ContainerDied","Data":"9af4ac3e4f9f54fbfceff37aaa53689791ff1f83701a0867de57d275370b066d"} Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.231347 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.401378 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-libvirt-secret-0\") pod \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\" (UID: \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\") " Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.401513 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-inventory\") pod \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\" (UID: \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\") " Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.401563 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-ssh-key\") pod \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\" (UID: \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\") " Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.401648 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-libvirt-combined-ca-bundle\") pod \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\" (UID: \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\") " Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.401688 4599 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md9vm\" (UniqueName: \"kubernetes.io/projected/ed0b094a-51d6-4287-b4a4-4a0934139fa2-kube-api-access-md9vm\") pod \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\" (UID: \"ed0b094a-51d6-4287-b4a4-4a0934139fa2\") " Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.407534 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed0b094a-51d6-4287-b4a4-4a0934139fa2-kube-api-access-md9vm" (OuterVolumeSpecName: "kube-api-access-md9vm") pod "ed0b094a-51d6-4287-b4a4-4a0934139fa2" (UID: "ed0b094a-51d6-4287-b4a4-4a0934139fa2"). InnerVolumeSpecName "kube-api-access-md9vm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.407998 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "ed0b094a-51d6-4287-b4a4-4a0934139fa2" (UID: "ed0b094a-51d6-4287-b4a4-4a0934139fa2"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.423944 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ed0b094a-51d6-4287-b4a4-4a0934139fa2" (UID: "ed0b094a-51d6-4287-b4a4-4a0934139fa2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.424313 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "ed0b094a-51d6-4287-b4a4-4a0934139fa2" (UID: "ed0b094a-51d6-4287-b4a4-4a0934139fa2"). 
InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.425130 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-inventory" (OuterVolumeSpecName: "inventory") pod "ed0b094a-51d6-4287-b4a4-4a0934139fa2" (UID: "ed0b094a-51d6-4287-b4a4-4a0934139fa2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.504363 4599 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.504469 4599 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.504526 4599 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.504607 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md9vm\" (UniqueName: \"kubernetes.io/projected/ed0b094a-51d6-4287-b4a4-4a0934139fa2-kube-api-access-md9vm\") on node \"crc\" DevicePath \"\"" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.504661 4599 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ed0b094a-51d6-4287-b4a4-4a0934139fa2-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.902655 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" event={"ID":"ed0b094a-51d6-4287-b4a4-4a0934139fa2","Type":"ContainerDied","Data":"8e653027c634f783bfe09e170e5cfc90de167a50c0bbcbed14c3c251a8427885"} Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.902709 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e653027c634f783bfe09e170e5cfc90de167a50c0bbcbed14c3c251a8427885" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.902770 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.982735 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q"] Oct 12 08:07:05 crc kubenswrapper[4599]: E1012 08:07:05.983411 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed0b094a-51d6-4287-b4a4-4a0934139fa2" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.983429 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0b094a-51d6-4287-b4a4-4a0934139fa2" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.983589 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed0b094a-51d6-4287-b4a4-4a0934139fa2" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.984140 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.986611 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.988140 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.990463 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.990498 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x82pw" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.992371 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.992469 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.996097 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 12 08:07:05 crc kubenswrapper[4599]: I1012 08:07:05.998880 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q"] Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.012062 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lt68\" (UniqueName: \"kubernetes.io/projected/c01400c9-ebac-486d-ac74-9cec09171386-kube-api-access-4lt68\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 
08:07:06.012128 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.012149 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.012174 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.012210 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.012249 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.012303 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c01400c9-ebac-486d-ac74-9cec09171386-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.012322 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.012396 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.113653 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lt68\" (UniqueName: \"kubernetes.io/projected/c01400c9-ebac-486d-ac74-9cec09171386-kube-api-access-4lt68\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.113709 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.113734 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.113757 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.113785 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.113819 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.113862 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c01400c9-ebac-486d-ac74-9cec09171386-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.113881 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.113924 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.115042 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c01400c9-ebac-486d-ac74-9cec09171386-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 
08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.118298 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.118354 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.118446 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.119109 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.121559 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: 
\"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.121628 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.121742 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.127214 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lt68\" (UniqueName: \"kubernetes.io/projected/c01400c9-ebac-486d-ac74-9cec09171386-kube-api-access-4lt68\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qxq8q\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.300403 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.730230 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q"] Oct 12 08:07:06 crc kubenswrapper[4599]: I1012 08:07:06.909800 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" event={"ID":"c01400c9-ebac-486d-ac74-9cec09171386","Type":"ContainerStarted","Data":"3f37ed70e2e4853b9674f03f96e2a18836469fb1f8f47e74f46651064c66ec68"} Oct 12 08:07:07 crc kubenswrapper[4599]: I1012 08:07:07.917363 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" event={"ID":"c01400c9-ebac-486d-ac74-9cec09171386","Type":"ContainerStarted","Data":"fcecdae8b0004a10f26d573dc7a46cb0e9019c0265fa4a57ce7081c5444340b1"} Oct 12 08:07:07 crc kubenswrapper[4599]: I1012 08:07:07.934555 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" podStartSLOduration=2.203351742 podStartE2EDuration="2.934537582s" podCreationTimestamp="2025-10-12 08:07:05 +0000 UTC" firstStartedPulling="2025-10-12 08:07:06.73628572 +0000 UTC m=+1923.525481222" lastFinishedPulling="2025-10-12 08:07:07.46747156 +0000 UTC m=+1924.256667062" observedRunningTime="2025-10-12 08:07:07.92935466 +0000 UTC m=+1924.718550162" watchObservedRunningTime="2025-10-12 08:07:07.934537582 +0000 UTC m=+1924.723733084" Oct 12 08:07:28 crc kubenswrapper[4599]: I1012 08:07:28.321488 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 08:07:28 crc kubenswrapper[4599]: I1012 08:07:28.322063 4599 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 08:07:58 crc kubenswrapper[4599]: I1012 08:07:58.322075 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 08:07:58 crc kubenswrapper[4599]: I1012 08:07:58.322678 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 08:07:58 crc kubenswrapper[4599]: I1012 08:07:58.322732 4599 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 08:07:58 crc kubenswrapper[4599]: I1012 08:07:58.323477 4599 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"27e98f9f32a7679119f49b8ab56e856d08b70cad719b2a459bd215581f62e802"} pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 08:07:58 crc kubenswrapper[4599]: I1012 08:07:58.323527 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" 
containerName="machine-config-daemon" containerID="cri-o://27e98f9f32a7679119f49b8ab56e856d08b70cad719b2a459bd215581f62e802" gracePeriod=600 Oct 12 08:07:59 crc kubenswrapper[4599]: I1012 08:07:59.280940 4599 generic.go:334] "Generic (PLEG): container finished" podID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerID="27e98f9f32a7679119f49b8ab56e856d08b70cad719b2a459bd215581f62e802" exitCode=0 Oct 12 08:07:59 crc kubenswrapper[4599]: I1012 08:07:59.280998 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerDied","Data":"27e98f9f32a7679119f49b8ab56e856d08b70cad719b2a459bd215581f62e802"} Oct 12 08:07:59 crc kubenswrapper[4599]: I1012 08:07:59.281531 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerStarted","Data":"4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba"} Oct 12 08:07:59 crc kubenswrapper[4599]: I1012 08:07:59.281553 4599 scope.go:117] "RemoveContainer" containerID="f369260da605ea09588ec62c0ee99215abbfa382e3bbd4d53aa9571e2652f968" Oct 12 08:09:14 crc kubenswrapper[4599]: I1012 08:09:14.398145 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b77s6"] Oct 12 08:09:14 crc kubenswrapper[4599]: I1012 08:09:14.400692 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b77s6" Oct 12 08:09:14 crc kubenswrapper[4599]: I1012 08:09:14.410834 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b77s6"] Oct 12 08:09:14 crc kubenswrapper[4599]: I1012 08:09:14.482249 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9fcc\" (UniqueName: \"kubernetes.io/projected/5a878a62-b505-42f1-8dff-e565ffc9e18b-kube-api-access-d9fcc\") pod \"redhat-operators-b77s6\" (UID: \"5a878a62-b505-42f1-8dff-e565ffc9e18b\") " pod="openshift-marketplace/redhat-operators-b77s6" Oct 12 08:09:14 crc kubenswrapper[4599]: I1012 08:09:14.482317 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a878a62-b505-42f1-8dff-e565ffc9e18b-utilities\") pod \"redhat-operators-b77s6\" (UID: \"5a878a62-b505-42f1-8dff-e565ffc9e18b\") " pod="openshift-marketplace/redhat-operators-b77s6" Oct 12 08:09:14 crc kubenswrapper[4599]: I1012 08:09:14.482382 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a878a62-b505-42f1-8dff-e565ffc9e18b-catalog-content\") pod \"redhat-operators-b77s6\" (UID: \"5a878a62-b505-42f1-8dff-e565ffc9e18b\") " pod="openshift-marketplace/redhat-operators-b77s6" Oct 12 08:09:14 crc kubenswrapper[4599]: I1012 08:09:14.584881 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9fcc\" (UniqueName: \"kubernetes.io/projected/5a878a62-b505-42f1-8dff-e565ffc9e18b-kube-api-access-d9fcc\") pod \"redhat-operators-b77s6\" (UID: \"5a878a62-b505-42f1-8dff-e565ffc9e18b\") " pod="openshift-marketplace/redhat-operators-b77s6" Oct 12 08:09:14 crc kubenswrapper[4599]: I1012 08:09:14.584978 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a878a62-b505-42f1-8dff-e565ffc9e18b-utilities\") pod \"redhat-operators-b77s6\" (UID: \"5a878a62-b505-42f1-8dff-e565ffc9e18b\") " pod="openshift-marketplace/redhat-operators-b77s6" Oct 12 08:09:14 crc kubenswrapper[4599]: I1012 08:09:14.585024 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a878a62-b505-42f1-8dff-e565ffc9e18b-catalog-content\") pod \"redhat-operators-b77s6\" (UID: \"5a878a62-b505-42f1-8dff-e565ffc9e18b\") " pod="openshift-marketplace/redhat-operators-b77s6" Oct 12 08:09:14 crc kubenswrapper[4599]: I1012 08:09:14.585627 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a878a62-b505-42f1-8dff-e565ffc9e18b-utilities\") pod \"redhat-operators-b77s6\" (UID: \"5a878a62-b505-42f1-8dff-e565ffc9e18b\") " pod="openshift-marketplace/redhat-operators-b77s6" Oct 12 08:09:14 crc kubenswrapper[4599]: I1012 08:09:14.585687 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a878a62-b505-42f1-8dff-e565ffc9e18b-catalog-content\") pod \"redhat-operators-b77s6\" (UID: \"5a878a62-b505-42f1-8dff-e565ffc9e18b\") " pod="openshift-marketplace/redhat-operators-b77s6" Oct 12 08:09:14 crc kubenswrapper[4599]: I1012 08:09:14.611669 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9fcc\" (UniqueName: \"kubernetes.io/projected/5a878a62-b505-42f1-8dff-e565ffc9e18b-kube-api-access-d9fcc\") pod \"redhat-operators-b77s6\" (UID: \"5a878a62-b505-42f1-8dff-e565ffc9e18b\") " pod="openshift-marketplace/redhat-operators-b77s6" Oct 12 08:09:14 crc kubenswrapper[4599]: I1012 08:09:14.718307 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b77s6" Oct 12 08:09:15 crc kubenswrapper[4599]: I1012 08:09:15.104389 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b77s6"] Oct 12 08:09:15 crc kubenswrapper[4599]: I1012 08:09:15.897314 4599 generic.go:334] "Generic (PLEG): container finished" podID="5a878a62-b505-42f1-8dff-e565ffc9e18b" containerID="bbbf52f82fa6836011cb12e6f0402d235f9713c9fae333ad19c32aeef3961029" exitCode=0 Oct 12 08:09:15 crc kubenswrapper[4599]: I1012 08:09:15.897401 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b77s6" event={"ID":"5a878a62-b505-42f1-8dff-e565ffc9e18b","Type":"ContainerDied","Data":"bbbf52f82fa6836011cb12e6f0402d235f9713c9fae333ad19c32aeef3961029"} Oct 12 08:09:15 crc kubenswrapper[4599]: I1012 08:09:15.897688 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b77s6" event={"ID":"5a878a62-b505-42f1-8dff-e565ffc9e18b","Type":"ContainerStarted","Data":"553c79bd104af6e94b9be962ceb8524762ec7dfdb5c9b46484ab6e9a9077410d"} Oct 12 08:09:15 crc kubenswrapper[4599]: I1012 08:09:15.899832 4599 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 08:09:16 crc kubenswrapper[4599]: I1012 08:09:16.908497 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b77s6" event={"ID":"5a878a62-b505-42f1-8dff-e565ffc9e18b","Type":"ContainerStarted","Data":"94a1f1635d64d479030e0651f9f4738c92df071230a952d2bae1d7abd352d901"} Oct 12 08:09:18 crc kubenswrapper[4599]: I1012 08:09:18.923147 4599 generic.go:334] "Generic (PLEG): container finished" podID="5a878a62-b505-42f1-8dff-e565ffc9e18b" containerID="94a1f1635d64d479030e0651f9f4738c92df071230a952d2bae1d7abd352d901" exitCode=0 Oct 12 08:09:18 crc kubenswrapper[4599]: I1012 08:09:18.923209 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-b77s6" event={"ID":"5a878a62-b505-42f1-8dff-e565ffc9e18b","Type":"ContainerDied","Data":"94a1f1635d64d479030e0651f9f4738c92df071230a952d2bae1d7abd352d901"} Oct 12 08:09:19 crc kubenswrapper[4599]: I1012 08:09:19.933559 4599 generic.go:334] "Generic (PLEG): container finished" podID="c01400c9-ebac-486d-ac74-9cec09171386" containerID="fcecdae8b0004a10f26d573dc7a46cb0e9019c0265fa4a57ce7081c5444340b1" exitCode=0 Oct 12 08:09:19 crc kubenswrapper[4599]: I1012 08:09:19.933656 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" event={"ID":"c01400c9-ebac-486d-ac74-9cec09171386","Type":"ContainerDied","Data":"fcecdae8b0004a10f26d573dc7a46cb0e9019c0265fa4a57ce7081c5444340b1"} Oct 12 08:09:19 crc kubenswrapper[4599]: I1012 08:09:19.937815 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b77s6" event={"ID":"5a878a62-b505-42f1-8dff-e565ffc9e18b","Type":"ContainerStarted","Data":"91e26687a87c367ac1aae6f1f1826f3c33981b8311cc847ffc994e24f5ef7f0d"} Oct 12 08:09:19 crc kubenswrapper[4599]: I1012 08:09:19.972790 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b77s6" podStartSLOduration=2.449493104 podStartE2EDuration="5.972775759s" podCreationTimestamp="2025-10-12 08:09:14 +0000 UTC" firstStartedPulling="2025-10-12 08:09:15.899554695 +0000 UTC m=+2052.688750198" lastFinishedPulling="2025-10-12 08:09:19.42283735 +0000 UTC m=+2056.212032853" observedRunningTime="2025-10-12 08:09:19.965349025 +0000 UTC m=+2056.754544528" watchObservedRunningTime="2025-10-12 08:09:19.972775759 +0000 UTC m=+2056.761971260" Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.238167 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.396386 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-inventory\") pod \"c01400c9-ebac-486d-ac74-9cec09171386\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.396595 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-migration-ssh-key-0\") pod \"c01400c9-ebac-486d-ac74-9cec09171386\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.396627 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-cell1-compute-config-1\") pod \"c01400c9-ebac-486d-ac74-9cec09171386\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.396720 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lt68\" (UniqueName: \"kubernetes.io/projected/c01400c9-ebac-486d-ac74-9cec09171386-kube-api-access-4lt68\") pod \"c01400c9-ebac-486d-ac74-9cec09171386\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.397256 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-cell1-compute-config-0\") pod \"c01400c9-ebac-486d-ac74-9cec09171386\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.397350 4599 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c01400c9-ebac-486d-ac74-9cec09171386-nova-extra-config-0\") pod \"c01400c9-ebac-486d-ac74-9cec09171386\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.397378 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-ssh-key\") pod \"c01400c9-ebac-486d-ac74-9cec09171386\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.397399 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-migration-ssh-key-1\") pod \"c01400c9-ebac-486d-ac74-9cec09171386\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.397430 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-combined-ca-bundle\") pod \"c01400c9-ebac-486d-ac74-9cec09171386\" (UID: \"c01400c9-ebac-486d-ac74-9cec09171386\") " Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.401936 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "c01400c9-ebac-486d-ac74-9cec09171386" (UID: "c01400c9-ebac-486d-ac74-9cec09171386"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.415706 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c01400c9-ebac-486d-ac74-9cec09171386-kube-api-access-4lt68" (OuterVolumeSpecName: "kube-api-access-4lt68") pod "c01400c9-ebac-486d-ac74-9cec09171386" (UID: "c01400c9-ebac-486d-ac74-9cec09171386"). InnerVolumeSpecName "kube-api-access-4lt68". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.417874 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c01400c9-ebac-486d-ac74-9cec09171386-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "c01400c9-ebac-486d-ac74-9cec09171386" (UID: "c01400c9-ebac-486d-ac74-9cec09171386"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.429775 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "c01400c9-ebac-486d-ac74-9cec09171386" (UID: "c01400c9-ebac-486d-ac74-9cec09171386"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.430057 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "c01400c9-ebac-486d-ac74-9cec09171386" (UID: "c01400c9-ebac-486d-ac74-9cec09171386"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.431968 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c01400c9-ebac-486d-ac74-9cec09171386" (UID: "c01400c9-ebac-486d-ac74-9cec09171386"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.433351 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-inventory" (OuterVolumeSpecName: "inventory") pod "c01400c9-ebac-486d-ac74-9cec09171386" (UID: "c01400c9-ebac-486d-ac74-9cec09171386"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.434315 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "c01400c9-ebac-486d-ac74-9cec09171386" (UID: "c01400c9-ebac-486d-ac74-9cec09171386"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.435725 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "c01400c9-ebac-486d-ac74-9cec09171386" (UID: "c01400c9-ebac-486d-ac74-9cec09171386"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.499822 4599 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.499862 4599 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.499873 4599 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.499882 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lt68\" (UniqueName: \"kubernetes.io/projected/c01400c9-ebac-486d-ac74-9cec09171386-kube-api-access-4lt68\") on node \"crc\" DevicePath \"\"" Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.499907 4599 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.499920 4599 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c01400c9-ebac-486d-ac74-9cec09171386-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.499928 4599 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 
08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.499940 4599 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.499948 4599 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01400c9-ebac-486d-ac74-9cec09171386-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.950760 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" event={"ID":"c01400c9-ebac-486d-ac74-9cec09171386","Type":"ContainerDied","Data":"3f37ed70e2e4853b9674f03f96e2a18836469fb1f8f47e74f46651064c66ec68"} Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.950798 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f37ed70e2e4853b9674f03f96e2a18836469fb1f8f47e74f46651064c66ec68" Oct 12 08:09:21 crc kubenswrapper[4599]: I1012 08:09:21.950812 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qxq8q" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.013999 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz"] Oct 12 08:09:22 crc kubenswrapper[4599]: E1012 08:09:22.014705 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c01400c9-ebac-486d-ac74-9cec09171386" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.014726 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="c01400c9-ebac-486d-ac74-9cec09171386" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.014945 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="c01400c9-ebac-486d-ac74-9cec09171386" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.015535 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.017323 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x82pw" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.017405 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.017634 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.017729 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.017782 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.024669 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz"] Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.108678 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.108733 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h85zr\" (UniqueName: \"kubernetes.io/projected/a489145f-f0fe-4e55-a9eb-29df1419aa2b-kube-api-access-h85zr\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.108805 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.108866 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.108959 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.109110 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.109143 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.210389 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.210434 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h85zr\" (UniqueName: \"kubernetes.io/projected/a489145f-f0fe-4e55-a9eb-29df1419aa2b-kube-api-access-h85zr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.210467 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.210504 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.211041 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.211104 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.211125 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.214039 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.214305 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.214885 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.221752 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.221889 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.222022 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.239232 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h85zr\" (UniqueName: \"kubernetes.io/projected/a489145f-f0fe-4e55-a9eb-29df1419aa2b-kube-api-access-h85zr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-b67cz\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.328127 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.758493 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz"] Oct 12 08:09:22 crc kubenswrapper[4599]: I1012 08:09:22.958804 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" event={"ID":"a489145f-f0fe-4e55-a9eb-29df1419aa2b","Type":"ContainerStarted","Data":"bddb8a95227e0b59b3b4ab1d8790ebd765af287b17b15f7fcd5aaba4293896c4"} Oct 12 08:09:23 crc kubenswrapper[4599]: I1012 08:09:23.967143 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" event={"ID":"a489145f-f0fe-4e55-a9eb-29df1419aa2b","Type":"ContainerStarted","Data":"3870ab6a136add39ded69316b945ffcf8bd9b5765c4f8437169d1911b446568a"} Oct 12 08:09:23 crc kubenswrapper[4599]: I1012 08:09:23.982182 4599 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" podStartSLOduration=1.380682982 podStartE2EDuration="1.98216727s" podCreationTimestamp="2025-10-12 08:09:22 +0000 UTC" firstStartedPulling="2025-10-12 08:09:22.767845638 +0000 UTC m=+2059.557041141" lastFinishedPulling="2025-10-12 08:09:23.369329927 +0000 UTC m=+2060.158525429" observedRunningTime="2025-10-12 08:09:23.977221476 +0000 UTC m=+2060.766416978" watchObservedRunningTime="2025-10-12 08:09:23.98216727 +0000 UTC m=+2060.771362773" Oct 12 08:09:24 crc kubenswrapper[4599]: I1012 08:09:24.719604 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b77s6" Oct 12 08:09:24 crc kubenswrapper[4599]: I1012 08:09:24.719654 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b77s6" Oct 12 08:09:24 crc kubenswrapper[4599]: I1012 08:09:24.762527 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b77s6" Oct 12 08:09:25 crc kubenswrapper[4599]: I1012 08:09:25.006419 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b77s6" Oct 12 08:09:25 crc kubenswrapper[4599]: I1012 08:09:25.043288 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b77s6"] Oct 12 08:09:26 crc kubenswrapper[4599]: I1012 08:09:26.992489 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b77s6" podUID="5a878a62-b505-42f1-8dff-e565ffc9e18b" containerName="registry-server" containerID="cri-o://91e26687a87c367ac1aae6f1f1826f3c33981b8311cc847ffc994e24f5ef7f0d" gracePeriod=2 Oct 12 08:09:27 crc kubenswrapper[4599]: I1012 08:09:27.358623 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b77s6" Oct 12 08:09:27 crc kubenswrapper[4599]: I1012 08:09:27.388951 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a878a62-b505-42f1-8dff-e565ffc9e18b-utilities\") pod \"5a878a62-b505-42f1-8dff-e565ffc9e18b\" (UID: \"5a878a62-b505-42f1-8dff-e565ffc9e18b\") " Oct 12 08:09:27 crc kubenswrapper[4599]: I1012 08:09:27.389105 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9fcc\" (UniqueName: \"kubernetes.io/projected/5a878a62-b505-42f1-8dff-e565ffc9e18b-kube-api-access-d9fcc\") pod \"5a878a62-b505-42f1-8dff-e565ffc9e18b\" (UID: \"5a878a62-b505-42f1-8dff-e565ffc9e18b\") " Oct 12 08:09:27 crc kubenswrapper[4599]: I1012 08:09:27.389164 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a878a62-b505-42f1-8dff-e565ffc9e18b-catalog-content\") pod \"5a878a62-b505-42f1-8dff-e565ffc9e18b\" (UID: \"5a878a62-b505-42f1-8dff-e565ffc9e18b\") " Oct 12 08:09:27 crc kubenswrapper[4599]: I1012 08:09:27.392156 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a878a62-b505-42f1-8dff-e565ffc9e18b-utilities" (OuterVolumeSpecName: "utilities") pod "5a878a62-b505-42f1-8dff-e565ffc9e18b" (UID: "5a878a62-b505-42f1-8dff-e565ffc9e18b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:09:27 crc kubenswrapper[4599]: I1012 08:09:27.396716 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a878a62-b505-42f1-8dff-e565ffc9e18b-kube-api-access-d9fcc" (OuterVolumeSpecName: "kube-api-access-d9fcc") pod "5a878a62-b505-42f1-8dff-e565ffc9e18b" (UID: "5a878a62-b505-42f1-8dff-e565ffc9e18b"). InnerVolumeSpecName "kube-api-access-d9fcc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:09:27 crc kubenswrapper[4599]: I1012 08:09:27.451372 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a878a62-b505-42f1-8dff-e565ffc9e18b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a878a62-b505-42f1-8dff-e565ffc9e18b" (UID: "5a878a62-b505-42f1-8dff-e565ffc9e18b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:09:27 crc kubenswrapper[4599]: I1012 08:09:27.492457 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a878a62-b505-42f1-8dff-e565ffc9e18b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 08:09:27 crc kubenswrapper[4599]: I1012 08:09:27.492495 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a878a62-b505-42f1-8dff-e565ffc9e18b-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 08:09:27 crc kubenswrapper[4599]: I1012 08:09:27.492513 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9fcc\" (UniqueName: \"kubernetes.io/projected/5a878a62-b505-42f1-8dff-e565ffc9e18b-kube-api-access-d9fcc\") on node \"crc\" DevicePath \"\"" Oct 12 08:09:28 crc kubenswrapper[4599]: I1012 08:09:28.000020 4599 generic.go:334] "Generic (PLEG): container finished" podID="5a878a62-b505-42f1-8dff-e565ffc9e18b" containerID="91e26687a87c367ac1aae6f1f1826f3c33981b8311cc847ffc994e24f5ef7f0d" exitCode=0 Oct 12 08:09:28 crc kubenswrapper[4599]: I1012 08:09:28.000068 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b77s6" event={"ID":"5a878a62-b505-42f1-8dff-e565ffc9e18b","Type":"ContainerDied","Data":"91e26687a87c367ac1aae6f1f1826f3c33981b8311cc847ffc994e24f5ef7f0d"} Oct 12 08:09:28 crc kubenswrapper[4599]: I1012 08:09:28.000081 4599 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b77s6" Oct 12 08:09:28 crc kubenswrapper[4599]: I1012 08:09:28.000096 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b77s6" event={"ID":"5a878a62-b505-42f1-8dff-e565ffc9e18b","Type":"ContainerDied","Data":"553c79bd104af6e94b9be962ceb8524762ec7dfdb5c9b46484ab6e9a9077410d"} Oct 12 08:09:28 crc kubenswrapper[4599]: I1012 08:09:28.000119 4599 scope.go:117] "RemoveContainer" containerID="91e26687a87c367ac1aae6f1f1826f3c33981b8311cc847ffc994e24f5ef7f0d" Oct 12 08:09:28 crc kubenswrapper[4599]: I1012 08:09:28.015811 4599 scope.go:117] "RemoveContainer" containerID="94a1f1635d64d479030e0651f9f4738c92df071230a952d2bae1d7abd352d901" Oct 12 08:09:28 crc kubenswrapper[4599]: I1012 08:09:28.019433 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b77s6"] Oct 12 08:09:28 crc kubenswrapper[4599]: I1012 08:09:28.025538 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b77s6"] Oct 12 08:09:28 crc kubenswrapper[4599]: I1012 08:09:28.032170 4599 scope.go:117] "RemoveContainer" containerID="bbbf52f82fa6836011cb12e6f0402d235f9713c9fae333ad19c32aeef3961029" Oct 12 08:09:28 crc kubenswrapper[4599]: I1012 08:09:28.063429 4599 scope.go:117] "RemoveContainer" containerID="91e26687a87c367ac1aae6f1f1826f3c33981b8311cc847ffc994e24f5ef7f0d" Oct 12 08:09:28 crc kubenswrapper[4599]: E1012 08:09:28.063699 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91e26687a87c367ac1aae6f1f1826f3c33981b8311cc847ffc994e24f5ef7f0d\": container with ID starting with 91e26687a87c367ac1aae6f1f1826f3c33981b8311cc847ffc994e24f5ef7f0d not found: ID does not exist" containerID="91e26687a87c367ac1aae6f1f1826f3c33981b8311cc847ffc994e24f5ef7f0d" Oct 12 08:09:28 crc kubenswrapper[4599]: I1012 08:09:28.063732 4599 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91e26687a87c367ac1aae6f1f1826f3c33981b8311cc847ffc994e24f5ef7f0d"} err="failed to get container status \"91e26687a87c367ac1aae6f1f1826f3c33981b8311cc847ffc994e24f5ef7f0d\": rpc error: code = NotFound desc = could not find container \"91e26687a87c367ac1aae6f1f1826f3c33981b8311cc847ffc994e24f5ef7f0d\": container with ID starting with 91e26687a87c367ac1aae6f1f1826f3c33981b8311cc847ffc994e24f5ef7f0d not found: ID does not exist" Oct 12 08:09:28 crc kubenswrapper[4599]: I1012 08:09:28.063751 4599 scope.go:117] "RemoveContainer" containerID="94a1f1635d64d479030e0651f9f4738c92df071230a952d2bae1d7abd352d901" Oct 12 08:09:28 crc kubenswrapper[4599]: E1012 08:09:28.063942 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94a1f1635d64d479030e0651f9f4738c92df071230a952d2bae1d7abd352d901\": container with ID starting with 94a1f1635d64d479030e0651f9f4738c92df071230a952d2bae1d7abd352d901 not found: ID does not exist" containerID="94a1f1635d64d479030e0651f9f4738c92df071230a952d2bae1d7abd352d901" Oct 12 08:09:28 crc kubenswrapper[4599]: I1012 08:09:28.063970 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94a1f1635d64d479030e0651f9f4738c92df071230a952d2bae1d7abd352d901"} err="failed to get container status \"94a1f1635d64d479030e0651f9f4738c92df071230a952d2bae1d7abd352d901\": rpc error: code = NotFound desc = could not find container \"94a1f1635d64d479030e0651f9f4738c92df071230a952d2bae1d7abd352d901\": container with ID starting with 94a1f1635d64d479030e0651f9f4738c92df071230a952d2bae1d7abd352d901 not found: ID does not exist" Oct 12 08:09:28 crc kubenswrapper[4599]: I1012 08:09:28.063988 4599 scope.go:117] "RemoveContainer" containerID="bbbf52f82fa6836011cb12e6f0402d235f9713c9fae333ad19c32aeef3961029" Oct 12 08:09:28 crc kubenswrapper[4599]: E1012 
08:09:28.064169 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbbf52f82fa6836011cb12e6f0402d235f9713c9fae333ad19c32aeef3961029\": container with ID starting with bbbf52f82fa6836011cb12e6f0402d235f9713c9fae333ad19c32aeef3961029 not found: ID does not exist" containerID="bbbf52f82fa6836011cb12e6f0402d235f9713c9fae333ad19c32aeef3961029" Oct 12 08:09:28 crc kubenswrapper[4599]: I1012 08:09:28.064194 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbbf52f82fa6836011cb12e6f0402d235f9713c9fae333ad19c32aeef3961029"} err="failed to get container status \"bbbf52f82fa6836011cb12e6f0402d235f9713c9fae333ad19c32aeef3961029\": rpc error: code = NotFound desc = could not find container \"bbbf52f82fa6836011cb12e6f0402d235f9713c9fae333ad19c32aeef3961029\": container with ID starting with bbbf52f82fa6836011cb12e6f0402d235f9713c9fae333ad19c32aeef3961029 not found: ID does not exist" Oct 12 08:09:29 crc kubenswrapper[4599]: I1012 08:09:29.553651 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a878a62-b505-42f1-8dff-e565ffc9e18b" path="/var/lib/kubelet/pods/5a878a62-b505-42f1-8dff-e565ffc9e18b/volumes" Oct 12 08:09:58 crc kubenswrapper[4599]: I1012 08:09:58.322188 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 08:09:58 crc kubenswrapper[4599]: I1012 08:09:58.322933 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 12 08:10:25 crc kubenswrapper[4599]: I1012 08:10:25.738039 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-95sck"] Oct 12 08:10:25 crc kubenswrapper[4599]: E1012 08:10:25.739005 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a878a62-b505-42f1-8dff-e565ffc9e18b" containerName="registry-server" Oct 12 08:10:25 crc kubenswrapper[4599]: I1012 08:10:25.739019 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a878a62-b505-42f1-8dff-e565ffc9e18b" containerName="registry-server" Oct 12 08:10:25 crc kubenswrapper[4599]: E1012 08:10:25.739038 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a878a62-b505-42f1-8dff-e565ffc9e18b" containerName="extract-content" Oct 12 08:10:25 crc kubenswrapper[4599]: I1012 08:10:25.739046 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a878a62-b505-42f1-8dff-e565ffc9e18b" containerName="extract-content" Oct 12 08:10:25 crc kubenswrapper[4599]: E1012 08:10:25.739056 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a878a62-b505-42f1-8dff-e565ffc9e18b" containerName="extract-utilities" Oct 12 08:10:25 crc kubenswrapper[4599]: I1012 08:10:25.739063 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a878a62-b505-42f1-8dff-e565ffc9e18b" containerName="extract-utilities" Oct 12 08:10:25 crc kubenswrapper[4599]: I1012 08:10:25.739254 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a878a62-b505-42f1-8dff-e565ffc9e18b" containerName="registry-server" Oct 12 08:10:25 crc kubenswrapper[4599]: I1012 08:10:25.740529 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-95sck" Oct 12 08:10:25 crc kubenswrapper[4599]: I1012 08:10:25.745931 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-95sck"] Oct 12 08:10:25 crc kubenswrapper[4599]: I1012 08:10:25.866232 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9318aead-0123-4f06-b29c-fb98dcc9fe31-catalog-content\") pod \"community-operators-95sck\" (UID: \"9318aead-0123-4f06-b29c-fb98dcc9fe31\") " pod="openshift-marketplace/community-operators-95sck" Oct 12 08:10:25 crc kubenswrapper[4599]: I1012 08:10:25.866301 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpwgm\" (UniqueName: \"kubernetes.io/projected/9318aead-0123-4f06-b29c-fb98dcc9fe31-kube-api-access-xpwgm\") pod \"community-operators-95sck\" (UID: \"9318aead-0123-4f06-b29c-fb98dcc9fe31\") " pod="openshift-marketplace/community-operators-95sck" Oct 12 08:10:25 crc kubenswrapper[4599]: I1012 08:10:25.866361 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9318aead-0123-4f06-b29c-fb98dcc9fe31-utilities\") pod \"community-operators-95sck\" (UID: \"9318aead-0123-4f06-b29c-fb98dcc9fe31\") " pod="openshift-marketplace/community-operators-95sck" Oct 12 08:10:25 crc kubenswrapper[4599]: I1012 08:10:25.967900 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9318aead-0123-4f06-b29c-fb98dcc9fe31-catalog-content\") pod \"community-operators-95sck\" (UID: \"9318aead-0123-4f06-b29c-fb98dcc9fe31\") " pod="openshift-marketplace/community-operators-95sck" Oct 12 08:10:25 crc kubenswrapper[4599]: I1012 08:10:25.967943 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xpwgm\" (UniqueName: \"kubernetes.io/projected/9318aead-0123-4f06-b29c-fb98dcc9fe31-kube-api-access-xpwgm\") pod \"community-operators-95sck\" (UID: \"9318aead-0123-4f06-b29c-fb98dcc9fe31\") " pod="openshift-marketplace/community-operators-95sck" Oct 12 08:10:25 crc kubenswrapper[4599]: I1012 08:10:25.967992 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9318aead-0123-4f06-b29c-fb98dcc9fe31-utilities\") pod \"community-operators-95sck\" (UID: \"9318aead-0123-4f06-b29c-fb98dcc9fe31\") " pod="openshift-marketplace/community-operators-95sck" Oct 12 08:10:25 crc kubenswrapper[4599]: I1012 08:10:25.968360 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9318aead-0123-4f06-b29c-fb98dcc9fe31-catalog-content\") pod \"community-operators-95sck\" (UID: \"9318aead-0123-4f06-b29c-fb98dcc9fe31\") " pod="openshift-marketplace/community-operators-95sck" Oct 12 08:10:25 crc kubenswrapper[4599]: I1012 08:10:25.968440 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9318aead-0123-4f06-b29c-fb98dcc9fe31-utilities\") pod \"community-operators-95sck\" (UID: \"9318aead-0123-4f06-b29c-fb98dcc9fe31\") " pod="openshift-marketplace/community-operators-95sck" Oct 12 08:10:25 crc kubenswrapper[4599]: I1012 08:10:25.986106 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpwgm\" (UniqueName: \"kubernetes.io/projected/9318aead-0123-4f06-b29c-fb98dcc9fe31-kube-api-access-xpwgm\") pod \"community-operators-95sck\" (UID: \"9318aead-0123-4f06-b29c-fb98dcc9fe31\") " pod="openshift-marketplace/community-operators-95sck" Oct 12 08:10:26 crc kubenswrapper[4599]: I1012 08:10:26.058864 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-95sck" Oct 12 08:10:26 crc kubenswrapper[4599]: I1012 08:10:26.457726 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-95sck"] Oct 12 08:10:27 crc kubenswrapper[4599]: I1012 08:10:27.402083 4599 generic.go:334] "Generic (PLEG): container finished" podID="9318aead-0123-4f06-b29c-fb98dcc9fe31" containerID="3e483e2c103da78e5803aa95c397cb1a74246657dd1107a50bfffd57ae37e75d" exitCode=0 Oct 12 08:10:27 crc kubenswrapper[4599]: I1012 08:10:27.402134 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95sck" event={"ID":"9318aead-0123-4f06-b29c-fb98dcc9fe31","Type":"ContainerDied","Data":"3e483e2c103da78e5803aa95c397cb1a74246657dd1107a50bfffd57ae37e75d"} Oct 12 08:10:27 crc kubenswrapper[4599]: I1012 08:10:27.402658 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95sck" event={"ID":"9318aead-0123-4f06-b29c-fb98dcc9fe31","Type":"ContainerStarted","Data":"874e00bf2218d9dfc957e431296a27fe6a8a78649dfd9f6b080979e9d50b1f93"} Oct 12 08:10:28 crc kubenswrapper[4599]: I1012 08:10:28.321706 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 08:10:28 crc kubenswrapper[4599]: I1012 08:10:28.321779 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 08:10:28 crc kubenswrapper[4599]: I1012 08:10:28.415148 4599 generic.go:334] "Generic 
(PLEG): container finished" podID="9318aead-0123-4f06-b29c-fb98dcc9fe31" containerID="f5ba6b825c44fad0a142df6660fdf40727efa3171d0bf5d1bd6c16de8452f9ef" exitCode=0 Oct 12 08:10:28 crc kubenswrapper[4599]: I1012 08:10:28.415183 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95sck" event={"ID":"9318aead-0123-4f06-b29c-fb98dcc9fe31","Type":"ContainerDied","Data":"f5ba6b825c44fad0a142df6660fdf40727efa3171d0bf5d1bd6c16de8452f9ef"} Oct 12 08:10:29 crc kubenswrapper[4599]: I1012 08:10:29.426511 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95sck" event={"ID":"9318aead-0123-4f06-b29c-fb98dcc9fe31","Type":"ContainerStarted","Data":"9e19d6f3b1dad1ca8c2e28e0602179f8a57a7839ff4d616ac974567481f9bad3"} Oct 12 08:10:29 crc kubenswrapper[4599]: I1012 08:10:29.444238 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-95sck" podStartSLOduration=2.964880576 podStartE2EDuration="4.444223725s" podCreationTimestamp="2025-10-12 08:10:25 +0000 UTC" firstStartedPulling="2025-10-12 08:10:27.403816109 +0000 UTC m=+2124.193011611" lastFinishedPulling="2025-10-12 08:10:28.883159258 +0000 UTC m=+2125.672354760" observedRunningTime="2025-10-12 08:10:29.440078018 +0000 UTC m=+2126.229273520" watchObservedRunningTime="2025-10-12 08:10:29.444223725 +0000 UTC m=+2126.233419227" Oct 12 08:10:36 crc kubenswrapper[4599]: I1012 08:10:36.060380 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-95sck" Oct 12 08:10:36 crc kubenswrapper[4599]: I1012 08:10:36.060840 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-95sck" Oct 12 08:10:36 crc kubenswrapper[4599]: I1012 08:10:36.094487 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-95sck" Oct 12 08:10:36 crc kubenswrapper[4599]: I1012 08:10:36.514393 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-95sck" Oct 12 08:10:36 crc kubenswrapper[4599]: I1012 08:10:36.551832 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-95sck"] Oct 12 08:10:38 crc kubenswrapper[4599]: I1012 08:10:38.489654 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-95sck" podUID="9318aead-0123-4f06-b29c-fb98dcc9fe31" containerName="registry-server" containerID="cri-o://9e19d6f3b1dad1ca8c2e28e0602179f8a57a7839ff4d616ac974567481f9bad3" gracePeriod=2 Oct 12 08:10:38 crc kubenswrapper[4599]: I1012 08:10:38.846295 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-95sck" Oct 12 08:10:38 crc kubenswrapper[4599]: I1012 08:10:38.985986 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpwgm\" (UniqueName: \"kubernetes.io/projected/9318aead-0123-4f06-b29c-fb98dcc9fe31-kube-api-access-xpwgm\") pod \"9318aead-0123-4f06-b29c-fb98dcc9fe31\" (UID: \"9318aead-0123-4f06-b29c-fb98dcc9fe31\") " Oct 12 08:10:38 crc kubenswrapper[4599]: I1012 08:10:38.986126 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9318aead-0123-4f06-b29c-fb98dcc9fe31-utilities\") pod \"9318aead-0123-4f06-b29c-fb98dcc9fe31\" (UID: \"9318aead-0123-4f06-b29c-fb98dcc9fe31\") " Oct 12 08:10:38 crc kubenswrapper[4599]: I1012 08:10:38.986182 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9318aead-0123-4f06-b29c-fb98dcc9fe31-catalog-content\") pod 
\"9318aead-0123-4f06-b29c-fb98dcc9fe31\" (UID: \"9318aead-0123-4f06-b29c-fb98dcc9fe31\") " Oct 12 08:10:38 crc kubenswrapper[4599]: I1012 08:10:38.987001 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9318aead-0123-4f06-b29c-fb98dcc9fe31-utilities" (OuterVolumeSpecName: "utilities") pod "9318aead-0123-4f06-b29c-fb98dcc9fe31" (UID: "9318aead-0123-4f06-b29c-fb98dcc9fe31"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:10:38 crc kubenswrapper[4599]: I1012 08:10:38.987367 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9318aead-0123-4f06-b29c-fb98dcc9fe31-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 08:10:38 crc kubenswrapper[4599]: I1012 08:10:38.991656 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9318aead-0123-4f06-b29c-fb98dcc9fe31-kube-api-access-xpwgm" (OuterVolumeSpecName: "kube-api-access-xpwgm") pod "9318aead-0123-4f06-b29c-fb98dcc9fe31" (UID: "9318aead-0123-4f06-b29c-fb98dcc9fe31"). InnerVolumeSpecName "kube-api-access-xpwgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:10:39 crc kubenswrapper[4599]: I1012 08:10:39.021735 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9318aead-0123-4f06-b29c-fb98dcc9fe31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9318aead-0123-4f06-b29c-fb98dcc9fe31" (UID: "9318aead-0123-4f06-b29c-fb98dcc9fe31"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:10:39 crc kubenswrapper[4599]: I1012 08:10:39.088932 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpwgm\" (UniqueName: \"kubernetes.io/projected/9318aead-0123-4f06-b29c-fb98dcc9fe31-kube-api-access-xpwgm\") on node \"crc\" DevicePath \"\"" Oct 12 08:10:39 crc kubenswrapper[4599]: I1012 08:10:39.088968 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9318aead-0123-4f06-b29c-fb98dcc9fe31-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 08:10:39 crc kubenswrapper[4599]: I1012 08:10:39.498208 4599 generic.go:334] "Generic (PLEG): container finished" podID="9318aead-0123-4f06-b29c-fb98dcc9fe31" containerID="9e19d6f3b1dad1ca8c2e28e0602179f8a57a7839ff4d616ac974567481f9bad3" exitCode=0 Oct 12 08:10:39 crc kubenswrapper[4599]: I1012 08:10:39.498245 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95sck" event={"ID":"9318aead-0123-4f06-b29c-fb98dcc9fe31","Type":"ContainerDied","Data":"9e19d6f3b1dad1ca8c2e28e0602179f8a57a7839ff4d616ac974567481f9bad3"} Oct 12 08:10:39 crc kubenswrapper[4599]: I1012 08:10:39.498263 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-95sck" Oct 12 08:10:39 crc kubenswrapper[4599]: I1012 08:10:39.498294 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95sck" event={"ID":"9318aead-0123-4f06-b29c-fb98dcc9fe31","Type":"ContainerDied","Data":"874e00bf2218d9dfc957e431296a27fe6a8a78649dfd9f6b080979e9d50b1f93"} Oct 12 08:10:39 crc kubenswrapper[4599]: I1012 08:10:39.498375 4599 scope.go:117] "RemoveContainer" containerID="9e19d6f3b1dad1ca8c2e28e0602179f8a57a7839ff4d616ac974567481f9bad3" Oct 12 08:10:39 crc kubenswrapper[4599]: I1012 08:10:39.519857 4599 scope.go:117] "RemoveContainer" containerID="f5ba6b825c44fad0a142df6660fdf40727efa3171d0bf5d1bd6c16de8452f9ef" Oct 12 08:10:39 crc kubenswrapper[4599]: I1012 08:10:39.535990 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-95sck"] Oct 12 08:10:39 crc kubenswrapper[4599]: I1012 08:10:39.556555 4599 scope.go:117] "RemoveContainer" containerID="3e483e2c103da78e5803aa95c397cb1a74246657dd1107a50bfffd57ae37e75d" Oct 12 08:10:39 crc kubenswrapper[4599]: I1012 08:10:39.558128 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-95sck"] Oct 12 08:10:39 crc kubenswrapper[4599]: I1012 08:10:39.577476 4599 scope.go:117] "RemoveContainer" containerID="9e19d6f3b1dad1ca8c2e28e0602179f8a57a7839ff4d616ac974567481f9bad3" Oct 12 08:10:39 crc kubenswrapper[4599]: E1012 08:10:39.577865 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e19d6f3b1dad1ca8c2e28e0602179f8a57a7839ff4d616ac974567481f9bad3\": container with ID starting with 9e19d6f3b1dad1ca8c2e28e0602179f8a57a7839ff4d616ac974567481f9bad3 not found: ID does not exist" containerID="9e19d6f3b1dad1ca8c2e28e0602179f8a57a7839ff4d616ac974567481f9bad3" Oct 12 08:10:39 crc kubenswrapper[4599]: I1012 08:10:39.577902 4599 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e19d6f3b1dad1ca8c2e28e0602179f8a57a7839ff4d616ac974567481f9bad3"} err="failed to get container status \"9e19d6f3b1dad1ca8c2e28e0602179f8a57a7839ff4d616ac974567481f9bad3\": rpc error: code = NotFound desc = could not find container \"9e19d6f3b1dad1ca8c2e28e0602179f8a57a7839ff4d616ac974567481f9bad3\": container with ID starting with 9e19d6f3b1dad1ca8c2e28e0602179f8a57a7839ff4d616ac974567481f9bad3 not found: ID does not exist" Oct 12 08:10:39 crc kubenswrapper[4599]: I1012 08:10:39.577937 4599 scope.go:117] "RemoveContainer" containerID="f5ba6b825c44fad0a142df6660fdf40727efa3171d0bf5d1bd6c16de8452f9ef" Oct 12 08:10:39 crc kubenswrapper[4599]: E1012 08:10:39.578261 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ba6b825c44fad0a142df6660fdf40727efa3171d0bf5d1bd6c16de8452f9ef\": container with ID starting with f5ba6b825c44fad0a142df6660fdf40727efa3171d0bf5d1bd6c16de8452f9ef not found: ID does not exist" containerID="f5ba6b825c44fad0a142df6660fdf40727efa3171d0bf5d1bd6c16de8452f9ef" Oct 12 08:10:39 crc kubenswrapper[4599]: I1012 08:10:39.578281 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ba6b825c44fad0a142df6660fdf40727efa3171d0bf5d1bd6c16de8452f9ef"} err="failed to get container status \"f5ba6b825c44fad0a142df6660fdf40727efa3171d0bf5d1bd6c16de8452f9ef\": rpc error: code = NotFound desc = could not find container \"f5ba6b825c44fad0a142df6660fdf40727efa3171d0bf5d1bd6c16de8452f9ef\": container with ID starting with f5ba6b825c44fad0a142df6660fdf40727efa3171d0bf5d1bd6c16de8452f9ef not found: ID does not exist" Oct 12 08:10:39 crc kubenswrapper[4599]: I1012 08:10:39.578295 4599 scope.go:117] "RemoveContainer" containerID="3e483e2c103da78e5803aa95c397cb1a74246657dd1107a50bfffd57ae37e75d" Oct 12 08:10:39 crc kubenswrapper[4599]: E1012 
08:10:39.578544 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e483e2c103da78e5803aa95c397cb1a74246657dd1107a50bfffd57ae37e75d\": container with ID starting with 3e483e2c103da78e5803aa95c397cb1a74246657dd1107a50bfffd57ae37e75d not found: ID does not exist" containerID="3e483e2c103da78e5803aa95c397cb1a74246657dd1107a50bfffd57ae37e75d" Oct 12 08:10:39 crc kubenswrapper[4599]: I1012 08:10:39.578572 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e483e2c103da78e5803aa95c397cb1a74246657dd1107a50bfffd57ae37e75d"} err="failed to get container status \"3e483e2c103da78e5803aa95c397cb1a74246657dd1107a50bfffd57ae37e75d\": rpc error: code = NotFound desc = could not find container \"3e483e2c103da78e5803aa95c397cb1a74246657dd1107a50bfffd57ae37e75d\": container with ID starting with 3e483e2c103da78e5803aa95c397cb1a74246657dd1107a50bfffd57ae37e75d not found: ID does not exist" Oct 12 08:10:41 crc kubenswrapper[4599]: I1012 08:10:41.554274 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9318aead-0123-4f06-b29c-fb98dcc9fe31" path="/var/lib/kubelet/pods/9318aead-0123-4f06-b29c-fb98dcc9fe31/volumes" Oct 12 08:10:58 crc kubenswrapper[4599]: I1012 08:10:58.322063 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 08:10:58 crc kubenswrapper[4599]: I1012 08:10:58.322547 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 12 08:10:58 crc kubenswrapper[4599]: I1012 08:10:58.322594 4599 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 08:10:58 crc kubenswrapper[4599]: I1012 08:10:58.323213 4599 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba"} pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 08:10:58 crc kubenswrapper[4599]: I1012 08:10:58.323266 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" containerID="cri-o://4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" gracePeriod=600 Oct 12 08:10:58 crc kubenswrapper[4599]: E1012 08:10:58.437735 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:10:58 crc kubenswrapper[4599]: I1012 08:10:58.624774 4599 generic.go:334] "Generic (PLEG): container finished" podID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" exitCode=0 Oct 12 08:10:58 crc kubenswrapper[4599]: I1012 08:10:58.624843 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" 
event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerDied","Data":"4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba"} Oct 12 08:10:58 crc kubenswrapper[4599]: I1012 08:10:58.625033 4599 scope.go:117] "RemoveContainer" containerID="27e98f9f32a7679119f49b8ab56e856d08b70cad719b2a459bd215581f62e802" Oct 12 08:10:58 crc kubenswrapper[4599]: I1012 08:10:58.625552 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:10:58 crc kubenswrapper[4599]: E1012 08:10:58.625816 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:10:59 crc kubenswrapper[4599]: I1012 08:10:59.633029 4599 generic.go:334] "Generic (PLEG): container finished" podID="a489145f-f0fe-4e55-a9eb-29df1419aa2b" containerID="3870ab6a136add39ded69316b945ffcf8bd9b5765c4f8437169d1911b446568a" exitCode=0 Oct 12 08:10:59 crc kubenswrapper[4599]: I1012 08:10:59.633065 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" event={"ID":"a489145f-f0fe-4e55-a9eb-29df1419aa2b","Type":"ContainerDied","Data":"3870ab6a136add39ded69316b945ffcf8bd9b5765c4f8437169d1911b446568a"} Oct 12 08:11:00 crc kubenswrapper[4599]: I1012 08:11:00.918305 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:11:00 crc kubenswrapper[4599]: I1012 08:11:00.944698 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-inventory\") pod \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " Oct 12 08:11:00 crc kubenswrapper[4599]: I1012 08:11:00.944862 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-telemetry-combined-ca-bundle\") pod \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " Oct 12 08:11:00 crc kubenswrapper[4599]: I1012 08:11:00.944896 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ceilometer-compute-config-data-1\") pod \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " Oct 12 08:11:00 crc kubenswrapper[4599]: I1012 08:11:00.945008 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ceilometer-compute-config-data-0\") pod \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " Oct 12 08:11:00 crc kubenswrapper[4599]: I1012 08:11:00.945050 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h85zr\" (UniqueName: \"kubernetes.io/projected/a489145f-f0fe-4e55-a9eb-29df1419aa2b-kube-api-access-h85zr\") pod \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " Oct 12 08:11:00 crc kubenswrapper[4599]: 
I1012 08:11:00.945067 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ssh-key\") pod \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " Oct 12 08:11:00 crc kubenswrapper[4599]: I1012 08:11:00.945659 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ceilometer-compute-config-data-2\") pod \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\" (UID: \"a489145f-f0fe-4e55-a9eb-29df1419aa2b\") " Oct 12 08:11:00 crc kubenswrapper[4599]: I1012 08:11:00.950169 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "a489145f-f0fe-4e55-a9eb-29df1419aa2b" (UID: "a489145f-f0fe-4e55-a9eb-29df1419aa2b"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:11:00 crc kubenswrapper[4599]: I1012 08:11:00.950345 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a489145f-f0fe-4e55-a9eb-29df1419aa2b-kube-api-access-h85zr" (OuterVolumeSpecName: "kube-api-access-h85zr") pod "a489145f-f0fe-4e55-a9eb-29df1419aa2b" (UID: "a489145f-f0fe-4e55-a9eb-29df1419aa2b"). InnerVolumeSpecName "kube-api-access-h85zr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:11:00 crc kubenswrapper[4599]: I1012 08:11:00.966537 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-inventory" (OuterVolumeSpecName: "inventory") pod "a489145f-f0fe-4e55-a9eb-29df1419aa2b" (UID: "a489145f-f0fe-4e55-a9eb-29df1419aa2b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:11:00 crc kubenswrapper[4599]: I1012 08:11:00.966951 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "a489145f-f0fe-4e55-a9eb-29df1419aa2b" (UID: "a489145f-f0fe-4e55-a9eb-29df1419aa2b"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:11:00 crc kubenswrapper[4599]: I1012 08:11:00.967290 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a489145f-f0fe-4e55-a9eb-29df1419aa2b" (UID: "a489145f-f0fe-4e55-a9eb-29df1419aa2b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:11:00 crc kubenswrapper[4599]: I1012 08:11:00.967910 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "a489145f-f0fe-4e55-a9eb-29df1419aa2b" (UID: "a489145f-f0fe-4e55-a9eb-29df1419aa2b"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:11:00 crc kubenswrapper[4599]: I1012 08:11:00.968656 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "a489145f-f0fe-4e55-a9eb-29df1419aa2b" (UID: "a489145f-f0fe-4e55-a9eb-29df1419aa2b"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:11:01 crc kubenswrapper[4599]: I1012 08:11:01.047649 4599 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 12 08:11:01 crc kubenswrapper[4599]: I1012 08:11:01.047676 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h85zr\" (UniqueName: \"kubernetes.io/projected/a489145f-f0fe-4e55-a9eb-29df1419aa2b-kube-api-access-h85zr\") on node \"crc\" DevicePath \"\"" Oct 12 08:11:01 crc kubenswrapper[4599]: I1012 08:11:01.047686 4599 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 08:11:01 crc kubenswrapper[4599]: I1012 08:11:01.047693 4599 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 12 08:11:01 crc kubenswrapper[4599]: I1012 08:11:01.047702 4599 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 08:11:01 crc kubenswrapper[4599]: I1012 08:11:01.047712 4599 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 08:11:01 crc kubenswrapper[4599]: I1012 08:11:01.047723 4599 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/a489145f-f0fe-4e55-a9eb-29df1419aa2b-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 12 08:11:01 crc kubenswrapper[4599]: I1012 08:11:01.644867 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" event={"ID":"a489145f-f0fe-4e55-a9eb-29df1419aa2b","Type":"ContainerDied","Data":"bddb8a95227e0b59b3b4ab1d8790ebd765af287b17b15f7fcd5aaba4293896c4"} Oct 12 08:11:01 crc kubenswrapper[4599]: I1012 08:11:01.644902 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bddb8a95227e0b59b3b4ab1d8790ebd765af287b17b15f7fcd5aaba4293896c4" Oct 12 08:11:01 crc kubenswrapper[4599]: I1012 08:11:01.644907 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-b67cz" Oct 12 08:11:12 crc kubenswrapper[4599]: I1012 08:11:12.545169 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:11:12 crc kubenswrapper[4599]: E1012 08:11:12.545822 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:11:23 crc kubenswrapper[4599]: I1012 08:11:23.549685 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:11:23 crc kubenswrapper[4599]: E1012 08:11:23.550388 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:11:34 crc kubenswrapper[4599]: I1012 08:11:34.544622 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:11:34 crc kubenswrapper[4599]: E1012 08:11:34.546274 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.765399 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 12 08:11:46 crc kubenswrapper[4599]: E1012 08:11:46.766120 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9318aead-0123-4f06-b29c-fb98dcc9fe31" containerName="extract-utilities" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.766133 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="9318aead-0123-4f06-b29c-fb98dcc9fe31" containerName="extract-utilities" Oct 12 08:11:46 crc kubenswrapper[4599]: E1012 08:11:46.766153 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9318aead-0123-4f06-b29c-fb98dcc9fe31" containerName="extract-content" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.766160 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="9318aead-0123-4f06-b29c-fb98dcc9fe31" containerName="extract-content" Oct 12 08:11:46 crc kubenswrapper[4599]: E1012 08:11:46.766169 4599 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="9318aead-0123-4f06-b29c-fb98dcc9fe31" containerName="registry-server" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.766175 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="9318aead-0123-4f06-b29c-fb98dcc9fe31" containerName="registry-server" Oct 12 08:11:46 crc kubenswrapper[4599]: E1012 08:11:46.766186 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a489145f-f0fe-4e55-a9eb-29df1419aa2b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.766192 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="a489145f-f0fe-4e55-a9eb-29df1419aa2b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.766358 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="a489145f-f0fe-4e55-a9eb-29df1419aa2b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.766376 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="9318aead-0123-4f06-b29c-fb98dcc9fe31" containerName="registry-server" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.766897 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.768423 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.768495 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.768762 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bdcjd" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.768869 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.772124 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.810822 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bdf67e61-9c15-4079-9a9b-a74c40ad364f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.810881 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bdf67e61-9c15-4079-9a9b-a74c40ad364f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.810991 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.811037 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwcrv\" (UniqueName: \"kubernetes.io/projected/bdf67e61-9c15-4079-9a9b-a74c40ad364f-kube-api-access-hwcrv\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.811086 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdf67e61-9c15-4079-9a9b-a74c40ad364f-config-data\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.811120 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bdf67e61-9c15-4079-9a9b-a74c40ad364f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.811171 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bdf67e61-9c15-4079-9a9b-a74c40ad364f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.811226 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/bdf67e61-9c15-4079-9a9b-a74c40ad364f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.811285 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bdf67e61-9c15-4079-9a9b-a74c40ad364f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.912160 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bdf67e61-9c15-4079-9a9b-a74c40ad364f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.912200 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bdf67e61-9c15-4079-9a9b-a74c40ad364f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.912255 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bdf67e61-9c15-4079-9a9b-a74c40ad364f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.912290 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/bdf67e61-9c15-4079-9a9b-a74c40ad364f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.912330 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.912382 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwcrv\" (UniqueName: \"kubernetes.io/projected/bdf67e61-9c15-4079-9a9b-a74c40ad364f-kube-api-access-hwcrv\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.912421 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdf67e61-9c15-4079-9a9b-a74c40ad364f-config-data\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.912447 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bdf67e61-9c15-4079-9a9b-a74c40ad364f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.912605 4599 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") 
device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.912842 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bdf67e61-9c15-4079-9a9b-a74c40ad364f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.912880 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bdf67e61-9c15-4079-9a9b-a74c40ad364f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.913416 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bdf67e61-9c15-4079-9a9b-a74c40ad364f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.913482 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdf67e61-9c15-4079-9a9b-a74c40ad364f-config-data\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.913515 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bdf67e61-9c15-4079-9a9b-a74c40ad364f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " 
pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.916812 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bdf67e61-9c15-4079-9a9b-a74c40ad364f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.917876 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bdf67e61-9c15-4079-9a9b-a74c40ad364f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.918657 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bdf67e61-9c15-4079-9a9b-a74c40ad364f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.924463 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwcrv\" (UniqueName: \"kubernetes.io/projected/bdf67e61-9c15-4079-9a9b-a74c40ad364f-kube-api-access-hwcrv\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:46 crc kubenswrapper[4599]: I1012 08:11:46.931677 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " pod="openstack/tempest-tests-tempest" Oct 12 08:11:47 crc kubenswrapper[4599]: I1012 08:11:47.082662 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 12 08:11:47 crc kubenswrapper[4599]: I1012 08:11:47.440739 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 12 08:11:47 crc kubenswrapper[4599]: I1012 08:11:47.544936 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:11:47 crc kubenswrapper[4599]: E1012 08:11:47.545173 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:11:47 crc kubenswrapper[4599]: I1012 08:11:47.938527 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"bdf67e61-9c15-4079-9a9b-a74c40ad364f","Type":"ContainerStarted","Data":"a7b48c8de6e13c722cc3bb00b3b1e5faebea30affa8e71d3b41e20a19f64add4"} Oct 12 08:11:59 crc kubenswrapper[4599]: I1012 08:11:59.545557 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:11:59 crc kubenswrapper[4599]: E1012 08:11:59.546418 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:12:11 crc kubenswrapper[4599]: I1012 08:12:11.545095 4599 scope.go:117] "RemoveContainer" 
containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:12:11 crc kubenswrapper[4599]: E1012 08:12:11.545834 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:12:22 crc kubenswrapper[4599]: I1012 08:12:22.546580 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:12:22 crc kubenswrapper[4599]: E1012 08:12:22.548738 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:12:36 crc kubenswrapper[4599]: I1012 08:12:36.545921 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:12:36 crc kubenswrapper[4599]: E1012 08:12:36.546823 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:12:36 crc kubenswrapper[4599]: I1012 08:12:36.951443 4599 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-849fk"] Oct 12 08:12:36 crc kubenswrapper[4599]: I1012 08:12:36.965982 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-849fk"] Oct 12 08:12:36 crc kubenswrapper[4599]: I1012 08:12:36.966209 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-849fk" Oct 12 08:12:37 crc kubenswrapper[4599]: I1012 08:12:37.083321 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a17b908-fab0-45b2-a224-3de60dbc3ce4-catalog-content\") pod \"certified-operators-849fk\" (UID: \"1a17b908-fab0-45b2-a224-3de60dbc3ce4\") " pod="openshift-marketplace/certified-operators-849fk" Oct 12 08:12:37 crc kubenswrapper[4599]: I1012 08:12:37.083397 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlzv8\" (UniqueName: \"kubernetes.io/projected/1a17b908-fab0-45b2-a224-3de60dbc3ce4-kube-api-access-wlzv8\") pod \"certified-operators-849fk\" (UID: \"1a17b908-fab0-45b2-a224-3de60dbc3ce4\") " pod="openshift-marketplace/certified-operators-849fk" Oct 12 08:12:37 crc kubenswrapper[4599]: I1012 08:12:37.083480 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a17b908-fab0-45b2-a224-3de60dbc3ce4-utilities\") pod \"certified-operators-849fk\" (UID: \"1a17b908-fab0-45b2-a224-3de60dbc3ce4\") " pod="openshift-marketplace/certified-operators-849fk" Oct 12 08:12:37 crc kubenswrapper[4599]: I1012 08:12:37.184809 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a17b908-fab0-45b2-a224-3de60dbc3ce4-catalog-content\") pod 
\"certified-operators-849fk\" (UID: \"1a17b908-fab0-45b2-a224-3de60dbc3ce4\") " pod="openshift-marketplace/certified-operators-849fk" Oct 12 08:12:37 crc kubenswrapper[4599]: I1012 08:12:37.184871 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlzv8\" (UniqueName: \"kubernetes.io/projected/1a17b908-fab0-45b2-a224-3de60dbc3ce4-kube-api-access-wlzv8\") pod \"certified-operators-849fk\" (UID: \"1a17b908-fab0-45b2-a224-3de60dbc3ce4\") " pod="openshift-marketplace/certified-operators-849fk" Oct 12 08:12:37 crc kubenswrapper[4599]: I1012 08:12:37.185017 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a17b908-fab0-45b2-a224-3de60dbc3ce4-utilities\") pod \"certified-operators-849fk\" (UID: \"1a17b908-fab0-45b2-a224-3de60dbc3ce4\") " pod="openshift-marketplace/certified-operators-849fk" Oct 12 08:12:37 crc kubenswrapper[4599]: I1012 08:12:37.185237 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a17b908-fab0-45b2-a224-3de60dbc3ce4-catalog-content\") pod \"certified-operators-849fk\" (UID: \"1a17b908-fab0-45b2-a224-3de60dbc3ce4\") " pod="openshift-marketplace/certified-operators-849fk" Oct 12 08:12:37 crc kubenswrapper[4599]: I1012 08:12:37.185480 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a17b908-fab0-45b2-a224-3de60dbc3ce4-utilities\") pod \"certified-operators-849fk\" (UID: \"1a17b908-fab0-45b2-a224-3de60dbc3ce4\") " pod="openshift-marketplace/certified-operators-849fk" Oct 12 08:12:37 crc kubenswrapper[4599]: I1012 08:12:37.203291 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlzv8\" (UniqueName: \"kubernetes.io/projected/1a17b908-fab0-45b2-a224-3de60dbc3ce4-kube-api-access-wlzv8\") pod \"certified-operators-849fk\" (UID: 
\"1a17b908-fab0-45b2-a224-3de60dbc3ce4\") " pod="openshift-marketplace/certified-operators-849fk" Oct 12 08:12:37 crc kubenswrapper[4599]: I1012 08:12:37.296485 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-849fk" Oct 12 08:12:39 crc kubenswrapper[4599]: E1012 08:12:39.189023 4599 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:92672cd85fd36317d65faa0525acf849" Oct 12 08:12:39 crc kubenswrapper[4599]: E1012 08:12:39.189457 4599 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:92672cd85fd36317d65faa0525acf849" Oct 12 08:12:39 crc kubenswrapper[4599]: E1012 08:12:39.189614 4599 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:92672cd85fd36317d65faa0525acf849,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hwcrv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Livene
ssProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(bdf67e61-9c15-4079-9a9b-a74c40ad364f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 08:12:39 crc kubenswrapper[4599]: E1012 08:12:39.190921 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="bdf67e61-9c15-4079-9a9b-a74c40ad364f" Oct 12 08:12:39 crc kubenswrapper[4599]: E1012 08:12:39.372345 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:92672cd85fd36317d65faa0525acf849\\\"\"" pod="openstack/tempest-tests-tempest" 
podUID="bdf67e61-9c15-4079-9a9b-a74c40ad364f" Oct 12 08:12:39 crc kubenswrapper[4599]: W1012 08:12:39.556187 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a17b908_fab0_45b2_a224_3de60dbc3ce4.slice/crio-27d58c4432ad636c248558d0a7b7f3f81771f12ed31d5ed25c9249ce3b9251aa WatchSource:0}: Error finding container 27d58c4432ad636c248558d0a7b7f3f81771f12ed31d5ed25c9249ce3b9251aa: Status 404 returned error can't find the container with id 27d58c4432ad636c248558d0a7b7f3f81771f12ed31d5ed25c9249ce3b9251aa Oct 12 08:12:39 crc kubenswrapper[4599]: I1012 08:12:39.558855 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-849fk"] Oct 12 08:12:40 crc kubenswrapper[4599]: I1012 08:12:40.379702 4599 generic.go:334] "Generic (PLEG): container finished" podID="1a17b908-fab0-45b2-a224-3de60dbc3ce4" containerID="b63d624739e7030f40768b26fae6e138f3c42e6a7c13a39d545f18d322f1198d" exitCode=0 Oct 12 08:12:40 crc kubenswrapper[4599]: I1012 08:12:40.379938 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-849fk" event={"ID":"1a17b908-fab0-45b2-a224-3de60dbc3ce4","Type":"ContainerDied","Data":"b63d624739e7030f40768b26fae6e138f3c42e6a7c13a39d545f18d322f1198d"} Oct 12 08:12:40 crc kubenswrapper[4599]: I1012 08:12:40.379960 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-849fk" event={"ID":"1a17b908-fab0-45b2-a224-3de60dbc3ce4","Type":"ContainerStarted","Data":"27d58c4432ad636c248558d0a7b7f3f81771f12ed31d5ed25c9249ce3b9251aa"} Oct 12 08:12:41 crc kubenswrapper[4599]: I1012 08:12:41.392665 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-849fk" event={"ID":"1a17b908-fab0-45b2-a224-3de60dbc3ce4","Type":"ContainerStarted","Data":"7ed91372cd8ded88685803c1f3557a6a202a38709991beb18c66ea87a26090f9"} Oct 12 08:12:42 
crc kubenswrapper[4599]: I1012 08:12:42.405711 4599 generic.go:334] "Generic (PLEG): container finished" podID="1a17b908-fab0-45b2-a224-3de60dbc3ce4" containerID="7ed91372cd8ded88685803c1f3557a6a202a38709991beb18c66ea87a26090f9" exitCode=0 Oct 12 08:12:42 crc kubenswrapper[4599]: I1012 08:12:42.405778 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-849fk" event={"ID":"1a17b908-fab0-45b2-a224-3de60dbc3ce4","Type":"ContainerDied","Data":"7ed91372cd8ded88685803c1f3557a6a202a38709991beb18c66ea87a26090f9"} Oct 12 08:12:43 crc kubenswrapper[4599]: I1012 08:12:43.416847 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-849fk" event={"ID":"1a17b908-fab0-45b2-a224-3de60dbc3ce4","Type":"ContainerStarted","Data":"07293bb57fd04ec0db9da38c5bef63d6c2654826a5b45e631b0fdda1085dd63a"} Oct 12 08:12:43 crc kubenswrapper[4599]: I1012 08:12:43.441153 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-849fk" podStartSLOduration=4.835422369 podStartE2EDuration="7.441131837s" podCreationTimestamp="2025-10-12 08:12:36 +0000 UTC" firstStartedPulling="2025-10-12 08:12:40.38200603 +0000 UTC m=+2257.171201532" lastFinishedPulling="2025-10-12 08:12:42.987715497 +0000 UTC m=+2259.776911000" observedRunningTime="2025-10-12 08:12:43.4330848 +0000 UTC m=+2260.222280302" watchObservedRunningTime="2025-10-12 08:12:43.441131837 +0000 UTC m=+2260.230327339" Oct 12 08:12:47 crc kubenswrapper[4599]: I1012 08:12:47.297159 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-849fk" Oct 12 08:12:47 crc kubenswrapper[4599]: I1012 08:12:47.297781 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-849fk" Oct 12 08:12:47 crc kubenswrapper[4599]: I1012 08:12:47.333454 4599 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/certified-operators-849fk" Oct 12 08:12:47 crc kubenswrapper[4599]: I1012 08:12:47.544929 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:12:47 crc kubenswrapper[4599]: E1012 08:12:47.545586 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:12:54 crc kubenswrapper[4599]: I1012 08:12:54.295924 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 12 08:12:55 crc kubenswrapper[4599]: I1012 08:12:55.558566 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"bdf67e61-9c15-4079-9a9b-a74c40ad364f","Type":"ContainerStarted","Data":"374d6d02b7a167e355b4e5c7f6c1e0680c1a93b8820598ad630e6d950b12eccd"} Oct 12 08:12:55 crc kubenswrapper[4599]: I1012 08:12:55.582014 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.7336347009999997 podStartE2EDuration="1m10.581995535s" podCreationTimestamp="2025-10-12 08:11:45 +0000 UTC" firstStartedPulling="2025-10-12 08:11:47.444569432 +0000 UTC m=+2204.233764934" lastFinishedPulling="2025-10-12 08:12:54.292930266 +0000 UTC m=+2271.082125768" observedRunningTime="2025-10-12 08:12:55.575879659 +0000 UTC m=+2272.365075162" watchObservedRunningTime="2025-10-12 08:12:55.581995535 +0000 UTC m=+2272.371191036" Oct 12 08:12:57 crc kubenswrapper[4599]: I1012 08:12:57.337533 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/certified-operators-849fk" Oct 12 08:12:57 crc kubenswrapper[4599]: I1012 08:12:57.379360 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-849fk"] Oct 12 08:12:57 crc kubenswrapper[4599]: I1012 08:12:57.572699 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-849fk" podUID="1a17b908-fab0-45b2-a224-3de60dbc3ce4" containerName="registry-server" containerID="cri-o://07293bb57fd04ec0db9da38c5bef63d6c2654826a5b45e631b0fdda1085dd63a" gracePeriod=2 Oct 12 08:12:57 crc kubenswrapper[4599]: I1012 08:12:57.953500 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-849fk" Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.144518 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlzv8\" (UniqueName: \"kubernetes.io/projected/1a17b908-fab0-45b2-a224-3de60dbc3ce4-kube-api-access-wlzv8\") pod \"1a17b908-fab0-45b2-a224-3de60dbc3ce4\" (UID: \"1a17b908-fab0-45b2-a224-3de60dbc3ce4\") " Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.144665 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a17b908-fab0-45b2-a224-3de60dbc3ce4-catalog-content\") pod \"1a17b908-fab0-45b2-a224-3de60dbc3ce4\" (UID: \"1a17b908-fab0-45b2-a224-3de60dbc3ce4\") " Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.145060 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a17b908-fab0-45b2-a224-3de60dbc3ce4-utilities\") pod \"1a17b908-fab0-45b2-a224-3de60dbc3ce4\" (UID: \"1a17b908-fab0-45b2-a224-3de60dbc3ce4\") " Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.145710 4599 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/1a17b908-fab0-45b2-a224-3de60dbc3ce4-utilities" (OuterVolumeSpecName: "utilities") pod "1a17b908-fab0-45b2-a224-3de60dbc3ce4" (UID: "1a17b908-fab0-45b2-a224-3de60dbc3ce4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.166456 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a17b908-fab0-45b2-a224-3de60dbc3ce4-kube-api-access-wlzv8" (OuterVolumeSpecName: "kube-api-access-wlzv8") pod "1a17b908-fab0-45b2-a224-3de60dbc3ce4" (UID: "1a17b908-fab0-45b2-a224-3de60dbc3ce4"). InnerVolumeSpecName "kube-api-access-wlzv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.183073 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a17b908-fab0-45b2-a224-3de60dbc3ce4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a17b908-fab0-45b2-a224-3de60dbc3ce4" (UID: "1a17b908-fab0-45b2-a224-3de60dbc3ce4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.248050 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a17b908-fab0-45b2-a224-3de60dbc3ce4-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.248084 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlzv8\" (UniqueName: \"kubernetes.io/projected/1a17b908-fab0-45b2-a224-3de60dbc3ce4-kube-api-access-wlzv8\") on node \"crc\" DevicePath \"\"" Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.248096 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a17b908-fab0-45b2-a224-3de60dbc3ce4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.596165 4599 generic.go:334] "Generic (PLEG): container finished" podID="1a17b908-fab0-45b2-a224-3de60dbc3ce4" containerID="07293bb57fd04ec0db9da38c5bef63d6c2654826a5b45e631b0fdda1085dd63a" exitCode=0 Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.596222 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-849fk" Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.596212 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-849fk" event={"ID":"1a17b908-fab0-45b2-a224-3de60dbc3ce4","Type":"ContainerDied","Data":"07293bb57fd04ec0db9da38c5bef63d6c2654826a5b45e631b0fdda1085dd63a"} Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.596728 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-849fk" event={"ID":"1a17b908-fab0-45b2-a224-3de60dbc3ce4","Type":"ContainerDied","Data":"27d58c4432ad636c248558d0a7b7f3f81771f12ed31d5ed25c9249ce3b9251aa"} Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.596754 4599 scope.go:117] "RemoveContainer" containerID="07293bb57fd04ec0db9da38c5bef63d6c2654826a5b45e631b0fdda1085dd63a" Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.616771 4599 scope.go:117] "RemoveContainer" containerID="7ed91372cd8ded88685803c1f3557a6a202a38709991beb18c66ea87a26090f9" Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.623289 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-849fk"] Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.629967 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-849fk"] Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.646240 4599 scope.go:117] "RemoveContainer" containerID="b63d624739e7030f40768b26fae6e138f3c42e6a7c13a39d545f18d322f1198d" Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.672153 4599 scope.go:117] "RemoveContainer" containerID="07293bb57fd04ec0db9da38c5bef63d6c2654826a5b45e631b0fdda1085dd63a" Oct 12 08:12:58 crc kubenswrapper[4599]: E1012 08:12:58.672575 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"07293bb57fd04ec0db9da38c5bef63d6c2654826a5b45e631b0fdda1085dd63a\": container with ID starting with 07293bb57fd04ec0db9da38c5bef63d6c2654826a5b45e631b0fdda1085dd63a not found: ID does not exist" containerID="07293bb57fd04ec0db9da38c5bef63d6c2654826a5b45e631b0fdda1085dd63a" Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.672603 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07293bb57fd04ec0db9da38c5bef63d6c2654826a5b45e631b0fdda1085dd63a"} err="failed to get container status \"07293bb57fd04ec0db9da38c5bef63d6c2654826a5b45e631b0fdda1085dd63a\": rpc error: code = NotFound desc = could not find container \"07293bb57fd04ec0db9da38c5bef63d6c2654826a5b45e631b0fdda1085dd63a\": container with ID starting with 07293bb57fd04ec0db9da38c5bef63d6c2654826a5b45e631b0fdda1085dd63a not found: ID does not exist" Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.672621 4599 scope.go:117] "RemoveContainer" containerID="7ed91372cd8ded88685803c1f3557a6a202a38709991beb18c66ea87a26090f9" Oct 12 08:12:58 crc kubenswrapper[4599]: E1012 08:12:58.672901 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ed91372cd8ded88685803c1f3557a6a202a38709991beb18c66ea87a26090f9\": container with ID starting with 7ed91372cd8ded88685803c1f3557a6a202a38709991beb18c66ea87a26090f9 not found: ID does not exist" containerID="7ed91372cd8ded88685803c1f3557a6a202a38709991beb18c66ea87a26090f9" Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.672919 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ed91372cd8ded88685803c1f3557a6a202a38709991beb18c66ea87a26090f9"} err="failed to get container status \"7ed91372cd8ded88685803c1f3557a6a202a38709991beb18c66ea87a26090f9\": rpc error: code = NotFound desc = could not find container \"7ed91372cd8ded88685803c1f3557a6a202a38709991beb18c66ea87a26090f9\": container with ID 
starting with 7ed91372cd8ded88685803c1f3557a6a202a38709991beb18c66ea87a26090f9 not found: ID does not exist" Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.672932 4599 scope.go:117] "RemoveContainer" containerID="b63d624739e7030f40768b26fae6e138f3c42e6a7c13a39d545f18d322f1198d" Oct 12 08:12:58 crc kubenswrapper[4599]: E1012 08:12:58.673371 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b63d624739e7030f40768b26fae6e138f3c42e6a7c13a39d545f18d322f1198d\": container with ID starting with b63d624739e7030f40768b26fae6e138f3c42e6a7c13a39d545f18d322f1198d not found: ID does not exist" containerID="b63d624739e7030f40768b26fae6e138f3c42e6a7c13a39d545f18d322f1198d" Oct 12 08:12:58 crc kubenswrapper[4599]: I1012 08:12:58.673403 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b63d624739e7030f40768b26fae6e138f3c42e6a7c13a39d545f18d322f1198d"} err="failed to get container status \"b63d624739e7030f40768b26fae6e138f3c42e6a7c13a39d545f18d322f1198d\": rpc error: code = NotFound desc = could not find container \"b63d624739e7030f40768b26fae6e138f3c42e6a7c13a39d545f18d322f1198d\": container with ID starting with b63d624739e7030f40768b26fae6e138f3c42e6a7c13a39d545f18d322f1198d not found: ID does not exist" Oct 12 08:12:59 crc kubenswrapper[4599]: I1012 08:12:59.556929 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a17b908-fab0-45b2-a224-3de60dbc3ce4" path="/var/lib/kubelet/pods/1a17b908-fab0-45b2-a224-3de60dbc3ce4/volumes" Oct 12 08:13:02 crc kubenswrapper[4599]: I1012 08:13:02.545041 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:13:02 crc kubenswrapper[4599]: E1012 08:13:02.545757 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:13:17 crc kubenswrapper[4599]: I1012 08:13:17.545326 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:13:17 crc kubenswrapper[4599]: E1012 08:13:17.545936 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:13:32 crc kubenswrapper[4599]: I1012 08:13:32.545700 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:13:32 crc kubenswrapper[4599]: E1012 08:13:32.546299 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:13:46 crc kubenswrapper[4599]: I1012 08:13:46.545247 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:13:46 crc kubenswrapper[4599]: E1012 08:13:46.545995 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:14:01 crc kubenswrapper[4599]: I1012 08:14:01.544846 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:14:01 crc kubenswrapper[4599]: E1012 08:14:01.545662 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:14:16 crc kubenswrapper[4599]: I1012 08:14:16.545049 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:14:16 crc kubenswrapper[4599]: E1012 08:14:16.545873 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:14:31 crc kubenswrapper[4599]: I1012 08:14:31.544954 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:14:31 crc kubenswrapper[4599]: E1012 08:14:31.545652 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:14:42 crc kubenswrapper[4599]: I1012 08:14:42.545248 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:14:42 crc kubenswrapper[4599]: E1012 08:14:42.545931 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:14:56 crc kubenswrapper[4599]: I1012 08:14:56.545961 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:14:56 crc kubenswrapper[4599]: E1012 08:14:56.546590 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:15:00 crc kubenswrapper[4599]: I1012 08:15:00.129612 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337615-hjt9z"] Oct 12 08:15:00 crc kubenswrapper[4599]: E1012 08:15:00.130421 4599 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1a17b908-fab0-45b2-a224-3de60dbc3ce4" containerName="registry-server" Oct 12 08:15:00 crc kubenswrapper[4599]: I1012 08:15:00.130433 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a17b908-fab0-45b2-a224-3de60dbc3ce4" containerName="registry-server" Oct 12 08:15:00 crc kubenswrapper[4599]: E1012 08:15:00.130447 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a17b908-fab0-45b2-a224-3de60dbc3ce4" containerName="extract-utilities" Oct 12 08:15:00 crc kubenswrapper[4599]: I1012 08:15:00.130454 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a17b908-fab0-45b2-a224-3de60dbc3ce4" containerName="extract-utilities" Oct 12 08:15:00 crc kubenswrapper[4599]: E1012 08:15:00.130477 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a17b908-fab0-45b2-a224-3de60dbc3ce4" containerName="extract-content" Oct 12 08:15:00 crc kubenswrapper[4599]: I1012 08:15:00.130482 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a17b908-fab0-45b2-a224-3de60dbc3ce4" containerName="extract-content" Oct 12 08:15:00 crc kubenswrapper[4599]: I1012 08:15:00.130641 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a17b908-fab0-45b2-a224-3de60dbc3ce4" containerName="registry-server" Oct 12 08:15:00 crc kubenswrapper[4599]: I1012 08:15:00.131184 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337615-hjt9z" Oct 12 08:15:00 crc kubenswrapper[4599]: I1012 08:15:00.132860 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 12 08:15:00 crc kubenswrapper[4599]: I1012 08:15:00.133847 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 12 08:15:00 crc kubenswrapper[4599]: I1012 08:15:00.135547 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337615-hjt9z"] Oct 12 08:15:00 crc kubenswrapper[4599]: I1012 08:15:00.278529 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx7v5\" (UniqueName: \"kubernetes.io/projected/3cd983e3-249e-4158-9a86-2bf0a7ade124-kube-api-access-kx7v5\") pod \"collect-profiles-29337615-hjt9z\" (UID: \"3cd983e3-249e-4158-9a86-2bf0a7ade124\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337615-hjt9z" Oct 12 08:15:00 crc kubenswrapper[4599]: I1012 08:15:00.278651 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3cd983e3-249e-4158-9a86-2bf0a7ade124-config-volume\") pod \"collect-profiles-29337615-hjt9z\" (UID: \"3cd983e3-249e-4158-9a86-2bf0a7ade124\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337615-hjt9z" Oct 12 08:15:00 crc kubenswrapper[4599]: I1012 08:15:00.278698 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3cd983e3-249e-4158-9a86-2bf0a7ade124-secret-volume\") pod \"collect-profiles-29337615-hjt9z\" (UID: \"3cd983e3-249e-4158-9a86-2bf0a7ade124\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29337615-hjt9z" Oct 12 08:15:00 crc kubenswrapper[4599]: I1012 08:15:00.380499 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx7v5\" (UniqueName: \"kubernetes.io/projected/3cd983e3-249e-4158-9a86-2bf0a7ade124-kube-api-access-kx7v5\") pod \"collect-profiles-29337615-hjt9z\" (UID: \"3cd983e3-249e-4158-9a86-2bf0a7ade124\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337615-hjt9z" Oct 12 08:15:00 crc kubenswrapper[4599]: I1012 08:15:00.380592 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3cd983e3-249e-4158-9a86-2bf0a7ade124-config-volume\") pod \"collect-profiles-29337615-hjt9z\" (UID: \"3cd983e3-249e-4158-9a86-2bf0a7ade124\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337615-hjt9z" Oct 12 08:15:00 crc kubenswrapper[4599]: I1012 08:15:00.380632 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3cd983e3-249e-4158-9a86-2bf0a7ade124-secret-volume\") pod \"collect-profiles-29337615-hjt9z\" (UID: \"3cd983e3-249e-4158-9a86-2bf0a7ade124\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337615-hjt9z" Oct 12 08:15:00 crc kubenswrapper[4599]: I1012 08:15:00.381435 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3cd983e3-249e-4158-9a86-2bf0a7ade124-config-volume\") pod \"collect-profiles-29337615-hjt9z\" (UID: \"3cd983e3-249e-4158-9a86-2bf0a7ade124\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337615-hjt9z" Oct 12 08:15:00 crc kubenswrapper[4599]: I1012 08:15:00.385464 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/3cd983e3-249e-4158-9a86-2bf0a7ade124-secret-volume\") pod \"collect-profiles-29337615-hjt9z\" (UID: \"3cd983e3-249e-4158-9a86-2bf0a7ade124\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337615-hjt9z" Oct 12 08:15:00 crc kubenswrapper[4599]: I1012 08:15:00.394397 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx7v5\" (UniqueName: \"kubernetes.io/projected/3cd983e3-249e-4158-9a86-2bf0a7ade124-kube-api-access-kx7v5\") pod \"collect-profiles-29337615-hjt9z\" (UID: \"3cd983e3-249e-4158-9a86-2bf0a7ade124\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337615-hjt9z" Oct 12 08:15:00 crc kubenswrapper[4599]: I1012 08:15:00.450514 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337615-hjt9z" Oct 12 08:15:00 crc kubenswrapper[4599]: I1012 08:15:00.806140 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337615-hjt9z"] Oct 12 08:15:01 crc kubenswrapper[4599]: I1012 08:15:01.431698 4599 generic.go:334] "Generic (PLEG): container finished" podID="3cd983e3-249e-4158-9a86-2bf0a7ade124" containerID="49bd4370b94556810b983e473a0dbb9438ca8940136aaa6ec274e2a889702df9" exitCode=0 Oct 12 08:15:01 crc kubenswrapper[4599]: I1012 08:15:01.431796 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337615-hjt9z" event={"ID":"3cd983e3-249e-4158-9a86-2bf0a7ade124","Type":"ContainerDied","Data":"49bd4370b94556810b983e473a0dbb9438ca8940136aaa6ec274e2a889702df9"} Oct 12 08:15:01 crc kubenswrapper[4599]: I1012 08:15:01.431933 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337615-hjt9z" 
event={"ID":"3cd983e3-249e-4158-9a86-2bf0a7ade124","Type":"ContainerStarted","Data":"436bcdd33809907d3f1123999a6d835db20819281db36f32da1bff0e8a51b845"} Oct 12 08:15:02 crc kubenswrapper[4599]: I1012 08:15:02.710919 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337615-hjt9z" Oct 12 08:15:02 crc kubenswrapper[4599]: I1012 08:15:02.814323 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3cd983e3-249e-4158-9a86-2bf0a7ade124-secret-volume\") pod \"3cd983e3-249e-4158-9a86-2bf0a7ade124\" (UID: \"3cd983e3-249e-4158-9a86-2bf0a7ade124\") " Oct 12 08:15:02 crc kubenswrapper[4599]: I1012 08:15:02.814386 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3cd983e3-249e-4158-9a86-2bf0a7ade124-config-volume\") pod \"3cd983e3-249e-4158-9a86-2bf0a7ade124\" (UID: \"3cd983e3-249e-4158-9a86-2bf0a7ade124\") " Oct 12 08:15:02 crc kubenswrapper[4599]: I1012 08:15:02.814474 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx7v5\" (UniqueName: \"kubernetes.io/projected/3cd983e3-249e-4158-9a86-2bf0a7ade124-kube-api-access-kx7v5\") pod \"3cd983e3-249e-4158-9a86-2bf0a7ade124\" (UID: \"3cd983e3-249e-4158-9a86-2bf0a7ade124\") " Oct 12 08:15:02 crc kubenswrapper[4599]: I1012 08:15:02.814931 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cd983e3-249e-4158-9a86-2bf0a7ade124-config-volume" (OuterVolumeSpecName: "config-volume") pod "3cd983e3-249e-4158-9a86-2bf0a7ade124" (UID: "3cd983e3-249e-4158-9a86-2bf0a7ade124"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 08:15:02 crc kubenswrapper[4599]: I1012 08:15:02.818610 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cd983e3-249e-4158-9a86-2bf0a7ade124-kube-api-access-kx7v5" (OuterVolumeSpecName: "kube-api-access-kx7v5") pod "3cd983e3-249e-4158-9a86-2bf0a7ade124" (UID: "3cd983e3-249e-4158-9a86-2bf0a7ade124"). InnerVolumeSpecName "kube-api-access-kx7v5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:15:02 crc kubenswrapper[4599]: I1012 08:15:02.819624 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cd983e3-249e-4158-9a86-2bf0a7ade124-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3cd983e3-249e-4158-9a86-2bf0a7ade124" (UID: "3cd983e3-249e-4158-9a86-2bf0a7ade124"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:15:02 crc kubenswrapper[4599]: I1012 08:15:02.916558 4599 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3cd983e3-249e-4158-9a86-2bf0a7ade124-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 12 08:15:02 crc kubenswrapper[4599]: I1012 08:15:02.916587 4599 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3cd983e3-249e-4158-9a86-2bf0a7ade124-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 08:15:02 crc kubenswrapper[4599]: I1012 08:15:02.916599 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx7v5\" (UniqueName: \"kubernetes.io/projected/3cd983e3-249e-4158-9a86-2bf0a7ade124-kube-api-access-kx7v5\") on node \"crc\" DevicePath \"\"" Oct 12 08:15:03 crc kubenswrapper[4599]: I1012 08:15:03.446778 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337615-hjt9z" 
event={"ID":"3cd983e3-249e-4158-9a86-2bf0a7ade124","Type":"ContainerDied","Data":"436bcdd33809907d3f1123999a6d835db20819281db36f32da1bff0e8a51b845"} Oct 12 08:15:03 crc kubenswrapper[4599]: I1012 08:15:03.446966 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="436bcdd33809907d3f1123999a6d835db20819281db36f32da1bff0e8a51b845" Oct 12 08:15:03 crc kubenswrapper[4599]: I1012 08:15:03.446826 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337615-hjt9z" Oct 12 08:15:03 crc kubenswrapper[4599]: I1012 08:15:03.757667 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r"] Oct 12 08:15:03 crc kubenswrapper[4599]: I1012 08:15:03.762701 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337570-l9r8r"] Oct 12 08:15:05 crc kubenswrapper[4599]: I1012 08:15:05.561933 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a560875d-c07e-457b-a77b-809cc770867c" path="/var/lib/kubelet/pods/a560875d-c07e-457b-a77b-809cc770867c/volumes" Oct 12 08:15:07 crc kubenswrapper[4599]: I1012 08:15:07.666028 4599 scope.go:117] "RemoveContainer" containerID="09d77bf8350365feb41b8b233fae9fd3ad7ad85f82c388edfbf15103162997eb" Oct 12 08:15:11 crc kubenswrapper[4599]: I1012 08:15:11.546162 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:15:11 crc kubenswrapper[4599]: E1012 08:15:11.546893 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:15:24 crc kubenswrapper[4599]: I1012 08:15:24.545637 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:15:24 crc kubenswrapper[4599]: E1012 08:15:24.546501 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:15:35 crc kubenswrapper[4599]: I1012 08:15:35.545654 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:15:35 crc kubenswrapper[4599]: E1012 08:15:35.546515 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:15:49 crc kubenswrapper[4599]: I1012 08:15:49.545494 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:15:49 crc kubenswrapper[4599]: E1012 08:15:49.546130 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:15:56 crc kubenswrapper[4599]: I1012 08:15:56.863059 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xt4fz"] Oct 12 08:15:56 crc kubenswrapper[4599]: E1012 08:15:56.863849 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd983e3-249e-4158-9a86-2bf0a7ade124" containerName="collect-profiles" Oct 12 08:15:56 crc kubenswrapper[4599]: I1012 08:15:56.863861 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd983e3-249e-4158-9a86-2bf0a7ade124" containerName="collect-profiles" Oct 12 08:15:56 crc kubenswrapper[4599]: I1012 08:15:56.864024 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cd983e3-249e-4158-9a86-2bf0a7ade124" containerName="collect-profiles" Oct 12 08:15:56 crc kubenswrapper[4599]: I1012 08:15:56.865133 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xt4fz" Oct 12 08:15:56 crc kubenswrapper[4599]: I1012 08:15:56.873157 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xt4fz"] Oct 12 08:15:56 crc kubenswrapper[4599]: I1012 08:15:56.955438 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfw4h\" (UniqueName: \"kubernetes.io/projected/74692910-d735-4a55-bcfc-3fa5da2c48ab-kube-api-access-jfw4h\") pod \"redhat-marketplace-xt4fz\" (UID: \"74692910-d735-4a55-bcfc-3fa5da2c48ab\") " pod="openshift-marketplace/redhat-marketplace-xt4fz" Oct 12 08:15:56 crc kubenswrapper[4599]: I1012 08:15:56.955509 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74692910-d735-4a55-bcfc-3fa5da2c48ab-catalog-content\") pod \"redhat-marketplace-xt4fz\" (UID: \"74692910-d735-4a55-bcfc-3fa5da2c48ab\") " pod="openshift-marketplace/redhat-marketplace-xt4fz" Oct 12 08:15:56 crc kubenswrapper[4599]: I1012 08:15:56.955615 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74692910-d735-4a55-bcfc-3fa5da2c48ab-utilities\") pod \"redhat-marketplace-xt4fz\" (UID: \"74692910-d735-4a55-bcfc-3fa5da2c48ab\") " pod="openshift-marketplace/redhat-marketplace-xt4fz" Oct 12 08:15:57 crc kubenswrapper[4599]: I1012 08:15:57.056831 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfw4h\" (UniqueName: \"kubernetes.io/projected/74692910-d735-4a55-bcfc-3fa5da2c48ab-kube-api-access-jfw4h\") pod \"redhat-marketplace-xt4fz\" (UID: \"74692910-d735-4a55-bcfc-3fa5da2c48ab\") " pod="openshift-marketplace/redhat-marketplace-xt4fz" Oct 12 08:15:57 crc kubenswrapper[4599]: I1012 08:15:57.056892 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74692910-d735-4a55-bcfc-3fa5da2c48ab-catalog-content\") pod \"redhat-marketplace-xt4fz\" (UID: \"74692910-d735-4a55-bcfc-3fa5da2c48ab\") " pod="openshift-marketplace/redhat-marketplace-xt4fz" Oct 12 08:15:57 crc kubenswrapper[4599]: I1012 08:15:57.056968 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74692910-d735-4a55-bcfc-3fa5da2c48ab-utilities\") pod \"redhat-marketplace-xt4fz\" (UID: \"74692910-d735-4a55-bcfc-3fa5da2c48ab\") " pod="openshift-marketplace/redhat-marketplace-xt4fz" Oct 12 08:15:57 crc kubenswrapper[4599]: I1012 08:15:57.057387 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74692910-d735-4a55-bcfc-3fa5da2c48ab-utilities\") pod \"redhat-marketplace-xt4fz\" (UID: \"74692910-d735-4a55-bcfc-3fa5da2c48ab\") " pod="openshift-marketplace/redhat-marketplace-xt4fz" Oct 12 08:15:57 crc kubenswrapper[4599]: I1012 08:15:57.057444 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74692910-d735-4a55-bcfc-3fa5da2c48ab-catalog-content\") pod \"redhat-marketplace-xt4fz\" (UID: \"74692910-d735-4a55-bcfc-3fa5da2c48ab\") " pod="openshift-marketplace/redhat-marketplace-xt4fz" Oct 12 08:15:57 crc kubenswrapper[4599]: I1012 08:15:57.072433 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfw4h\" (UniqueName: \"kubernetes.io/projected/74692910-d735-4a55-bcfc-3fa5da2c48ab-kube-api-access-jfw4h\") pod \"redhat-marketplace-xt4fz\" (UID: \"74692910-d735-4a55-bcfc-3fa5da2c48ab\") " pod="openshift-marketplace/redhat-marketplace-xt4fz" Oct 12 08:15:57 crc kubenswrapper[4599]: I1012 08:15:57.188099 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xt4fz" Oct 12 08:15:57 crc kubenswrapper[4599]: I1012 08:15:57.567353 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xt4fz"] Oct 12 08:15:57 crc kubenswrapper[4599]: I1012 08:15:57.789526 4599 generic.go:334] "Generic (PLEG): container finished" podID="74692910-d735-4a55-bcfc-3fa5da2c48ab" containerID="08e3dfa1a82162354c72f5e4934083e3f3029eff9d35aee3fc818ed25d928e81" exitCode=0 Oct 12 08:15:57 crc kubenswrapper[4599]: I1012 08:15:57.789568 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xt4fz" event={"ID":"74692910-d735-4a55-bcfc-3fa5da2c48ab","Type":"ContainerDied","Data":"08e3dfa1a82162354c72f5e4934083e3f3029eff9d35aee3fc818ed25d928e81"} Oct 12 08:15:57 crc kubenswrapper[4599]: I1012 08:15:57.789595 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xt4fz" event={"ID":"74692910-d735-4a55-bcfc-3fa5da2c48ab","Type":"ContainerStarted","Data":"5d26a9b3467d7f7905014cb6c2c981b0dfbb771674d48e61555faa80f946ced0"} Oct 12 08:15:57 crc kubenswrapper[4599]: I1012 08:15:57.792132 4599 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 08:15:58 crc kubenswrapper[4599]: I1012 08:15:58.797542 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xt4fz" event={"ID":"74692910-d735-4a55-bcfc-3fa5da2c48ab","Type":"ContainerStarted","Data":"f90008f6bf6da86d3d3e6df3cb680998cfdad4cfa0ca33b87a4a848174b35d65"} Oct 12 08:15:59 crc kubenswrapper[4599]: I1012 08:15:59.804898 4599 generic.go:334] "Generic (PLEG): container finished" podID="74692910-d735-4a55-bcfc-3fa5da2c48ab" containerID="f90008f6bf6da86d3d3e6df3cb680998cfdad4cfa0ca33b87a4a848174b35d65" exitCode=0 Oct 12 08:15:59 crc kubenswrapper[4599]: I1012 08:15:59.804942 4599 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-xt4fz" event={"ID":"74692910-d735-4a55-bcfc-3fa5da2c48ab","Type":"ContainerDied","Data":"f90008f6bf6da86d3d3e6df3cb680998cfdad4cfa0ca33b87a4a848174b35d65"} Oct 12 08:16:00 crc kubenswrapper[4599]: I1012 08:16:00.813415 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xt4fz" event={"ID":"74692910-d735-4a55-bcfc-3fa5da2c48ab","Type":"ContainerStarted","Data":"085ed2701bd2960b81c759285e7d74d8cfe7f4f4557b2fdf8b6c37fa186ce933"} Oct 12 08:16:00 crc kubenswrapper[4599]: I1012 08:16:00.830380 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xt4fz" podStartSLOduration=2.319192252 podStartE2EDuration="4.830363748s" podCreationTimestamp="2025-10-12 08:15:56 +0000 UTC" firstStartedPulling="2025-10-12 08:15:57.791885687 +0000 UTC m=+2454.581081189" lastFinishedPulling="2025-10-12 08:16:00.303057183 +0000 UTC m=+2457.092252685" observedRunningTime="2025-10-12 08:16:00.825201403 +0000 UTC m=+2457.614396904" watchObservedRunningTime="2025-10-12 08:16:00.830363748 +0000 UTC m=+2457.619559250" Oct 12 08:16:01 crc kubenswrapper[4599]: I1012 08:16:01.545794 4599 scope.go:117] "RemoveContainer" containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:16:01 crc kubenswrapper[4599]: I1012 08:16:01.821807 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerStarted","Data":"c2c141d22e067a6383af16f2655ff809ed93e78d4a047bd3c7fdced41256a1ea"} Oct 12 08:16:07 crc kubenswrapper[4599]: I1012 08:16:07.188718 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xt4fz" Oct 12 08:16:07 crc kubenswrapper[4599]: I1012 08:16:07.189241 4599 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xt4fz" Oct 12 08:16:07 crc kubenswrapper[4599]: I1012 08:16:07.222765 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xt4fz" Oct 12 08:16:07 crc kubenswrapper[4599]: I1012 08:16:07.888097 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xt4fz" Oct 12 08:16:07 crc kubenswrapper[4599]: I1012 08:16:07.920481 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xt4fz"] Oct 12 08:16:09 crc kubenswrapper[4599]: I1012 08:16:09.868793 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xt4fz" podUID="74692910-d735-4a55-bcfc-3fa5da2c48ab" containerName="registry-server" containerID="cri-o://085ed2701bd2960b81c759285e7d74d8cfe7f4f4557b2fdf8b6c37fa186ce933" gracePeriod=2 Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.264842 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xt4fz" Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.363496 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfw4h\" (UniqueName: \"kubernetes.io/projected/74692910-d735-4a55-bcfc-3fa5da2c48ab-kube-api-access-jfw4h\") pod \"74692910-d735-4a55-bcfc-3fa5da2c48ab\" (UID: \"74692910-d735-4a55-bcfc-3fa5da2c48ab\") " Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.363594 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74692910-d735-4a55-bcfc-3fa5da2c48ab-utilities\") pod \"74692910-d735-4a55-bcfc-3fa5da2c48ab\" (UID: \"74692910-d735-4a55-bcfc-3fa5da2c48ab\") " Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.363614 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74692910-d735-4a55-bcfc-3fa5da2c48ab-catalog-content\") pod \"74692910-d735-4a55-bcfc-3fa5da2c48ab\" (UID: \"74692910-d735-4a55-bcfc-3fa5da2c48ab\") " Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.364350 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74692910-d735-4a55-bcfc-3fa5da2c48ab-utilities" (OuterVolumeSpecName: "utilities") pod "74692910-d735-4a55-bcfc-3fa5da2c48ab" (UID: "74692910-d735-4a55-bcfc-3fa5da2c48ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.368584 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74692910-d735-4a55-bcfc-3fa5da2c48ab-kube-api-access-jfw4h" (OuterVolumeSpecName: "kube-api-access-jfw4h") pod "74692910-d735-4a55-bcfc-3fa5da2c48ab" (UID: "74692910-d735-4a55-bcfc-3fa5da2c48ab"). InnerVolumeSpecName "kube-api-access-jfw4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.372690 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74692910-d735-4a55-bcfc-3fa5da2c48ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74692910-d735-4a55-bcfc-3fa5da2c48ab" (UID: "74692910-d735-4a55-bcfc-3fa5da2c48ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.465252 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfw4h\" (UniqueName: \"kubernetes.io/projected/74692910-d735-4a55-bcfc-3fa5da2c48ab-kube-api-access-jfw4h\") on node \"crc\" DevicePath \"\"" Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.465276 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74692910-d735-4a55-bcfc-3fa5da2c48ab-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.465285 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74692910-d735-4a55-bcfc-3fa5da2c48ab-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.876667 4599 generic.go:334] "Generic (PLEG): container finished" podID="74692910-d735-4a55-bcfc-3fa5da2c48ab" containerID="085ed2701bd2960b81c759285e7d74d8cfe7f4f4557b2fdf8b6c37fa186ce933" exitCode=0 Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.876722 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xt4fz" Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.876745 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xt4fz" event={"ID":"74692910-d735-4a55-bcfc-3fa5da2c48ab","Type":"ContainerDied","Data":"085ed2701bd2960b81c759285e7d74d8cfe7f4f4557b2fdf8b6c37fa186ce933"} Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.877008 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xt4fz" event={"ID":"74692910-d735-4a55-bcfc-3fa5da2c48ab","Type":"ContainerDied","Data":"5d26a9b3467d7f7905014cb6c2c981b0dfbb771674d48e61555faa80f946ced0"} Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.877028 4599 scope.go:117] "RemoveContainer" containerID="085ed2701bd2960b81c759285e7d74d8cfe7f4f4557b2fdf8b6c37fa186ce933" Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.892026 4599 scope.go:117] "RemoveContainer" containerID="f90008f6bf6da86d3d3e6df3cb680998cfdad4cfa0ca33b87a4a848174b35d65" Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.901533 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xt4fz"] Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.906812 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xt4fz"] Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.909081 4599 scope.go:117] "RemoveContainer" containerID="08e3dfa1a82162354c72f5e4934083e3f3029eff9d35aee3fc818ed25d928e81" Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.943248 4599 scope.go:117] "RemoveContainer" containerID="085ed2701bd2960b81c759285e7d74d8cfe7f4f4557b2fdf8b6c37fa186ce933" Oct 12 08:16:10 crc kubenswrapper[4599]: E1012 08:16:10.943612 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"085ed2701bd2960b81c759285e7d74d8cfe7f4f4557b2fdf8b6c37fa186ce933\": container with ID starting with 085ed2701bd2960b81c759285e7d74d8cfe7f4f4557b2fdf8b6c37fa186ce933 not found: ID does not exist" containerID="085ed2701bd2960b81c759285e7d74d8cfe7f4f4557b2fdf8b6c37fa186ce933" Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.943646 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"085ed2701bd2960b81c759285e7d74d8cfe7f4f4557b2fdf8b6c37fa186ce933"} err="failed to get container status \"085ed2701bd2960b81c759285e7d74d8cfe7f4f4557b2fdf8b6c37fa186ce933\": rpc error: code = NotFound desc = could not find container \"085ed2701bd2960b81c759285e7d74d8cfe7f4f4557b2fdf8b6c37fa186ce933\": container with ID starting with 085ed2701bd2960b81c759285e7d74d8cfe7f4f4557b2fdf8b6c37fa186ce933 not found: ID does not exist" Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.943667 4599 scope.go:117] "RemoveContainer" containerID="f90008f6bf6da86d3d3e6df3cb680998cfdad4cfa0ca33b87a4a848174b35d65" Oct 12 08:16:10 crc kubenswrapper[4599]: E1012 08:16:10.943995 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f90008f6bf6da86d3d3e6df3cb680998cfdad4cfa0ca33b87a4a848174b35d65\": container with ID starting with f90008f6bf6da86d3d3e6df3cb680998cfdad4cfa0ca33b87a4a848174b35d65 not found: ID does not exist" containerID="f90008f6bf6da86d3d3e6df3cb680998cfdad4cfa0ca33b87a4a848174b35d65" Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.944021 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f90008f6bf6da86d3d3e6df3cb680998cfdad4cfa0ca33b87a4a848174b35d65"} err="failed to get container status \"f90008f6bf6da86d3d3e6df3cb680998cfdad4cfa0ca33b87a4a848174b35d65\": rpc error: code = NotFound desc = could not find container \"f90008f6bf6da86d3d3e6df3cb680998cfdad4cfa0ca33b87a4a848174b35d65\": container with ID 
starting with f90008f6bf6da86d3d3e6df3cb680998cfdad4cfa0ca33b87a4a848174b35d65 not found: ID does not exist" Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.944036 4599 scope.go:117] "RemoveContainer" containerID="08e3dfa1a82162354c72f5e4934083e3f3029eff9d35aee3fc818ed25d928e81" Oct 12 08:16:10 crc kubenswrapper[4599]: E1012 08:16:10.944289 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08e3dfa1a82162354c72f5e4934083e3f3029eff9d35aee3fc818ed25d928e81\": container with ID starting with 08e3dfa1a82162354c72f5e4934083e3f3029eff9d35aee3fc818ed25d928e81 not found: ID does not exist" containerID="08e3dfa1a82162354c72f5e4934083e3f3029eff9d35aee3fc818ed25d928e81" Oct 12 08:16:10 crc kubenswrapper[4599]: I1012 08:16:10.944324 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08e3dfa1a82162354c72f5e4934083e3f3029eff9d35aee3fc818ed25d928e81"} err="failed to get container status \"08e3dfa1a82162354c72f5e4934083e3f3029eff9d35aee3fc818ed25d928e81\": rpc error: code = NotFound desc = could not find container \"08e3dfa1a82162354c72f5e4934083e3f3029eff9d35aee3fc818ed25d928e81\": container with ID starting with 08e3dfa1a82162354c72f5e4934083e3f3029eff9d35aee3fc818ed25d928e81 not found: ID does not exist" Oct 12 08:16:11 crc kubenswrapper[4599]: I1012 08:16:11.553078 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74692910-d735-4a55-bcfc-3fa5da2c48ab" path="/var/lib/kubelet/pods/74692910-d735-4a55-bcfc-3fa5da2c48ab/volumes" Oct 12 08:18:28 crc kubenswrapper[4599]: I1012 08:18:28.321454 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 08:18:28 crc kubenswrapper[4599]: I1012 
08:18:28.322523 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 08:18:58 crc kubenswrapper[4599]: I1012 08:18:58.321975 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 08:18:58 crc kubenswrapper[4599]: I1012 08:18:58.322518 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 08:19:28 crc kubenswrapper[4599]: I1012 08:19:28.321330 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 08:19:28 crc kubenswrapper[4599]: I1012 08:19:28.321714 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 08:19:28 crc kubenswrapper[4599]: I1012 08:19:28.321748 4599 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 08:19:28 crc kubenswrapper[4599]: I1012 08:19:28.322166 4599 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c2c141d22e067a6383af16f2655ff809ed93e78d4a047bd3c7fdced41256a1ea"} pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 08:19:28 crc kubenswrapper[4599]: I1012 08:19:28.322204 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" containerID="cri-o://c2c141d22e067a6383af16f2655ff809ed93e78d4a047bd3c7fdced41256a1ea" gracePeriod=600 Oct 12 08:19:29 crc kubenswrapper[4599]: I1012 08:19:29.178561 4599 generic.go:334] "Generic (PLEG): container finished" podID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerID="c2c141d22e067a6383af16f2655ff809ed93e78d4a047bd3c7fdced41256a1ea" exitCode=0 Oct 12 08:19:29 crc kubenswrapper[4599]: I1012 08:19:29.178633 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerDied","Data":"c2c141d22e067a6383af16f2655ff809ed93e78d4a047bd3c7fdced41256a1ea"} Oct 12 08:19:29 crc kubenswrapper[4599]: I1012 08:19:29.179047 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerStarted","Data":"536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f"} Oct 12 08:19:29 crc kubenswrapper[4599]: I1012 08:19:29.179066 4599 scope.go:117] "RemoveContainer" 
containerID="4e7d8f7d3d60ebdc79d83491f148013a2c65198dbd3b53c026bfcf65f06699ba" Oct 12 08:21:12 crc kubenswrapper[4599]: I1012 08:21:12.426511 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zq5xp"] Oct 12 08:21:12 crc kubenswrapper[4599]: E1012 08:21:12.427244 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74692910-d735-4a55-bcfc-3fa5da2c48ab" containerName="extract-utilities" Oct 12 08:21:12 crc kubenswrapper[4599]: I1012 08:21:12.427257 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="74692910-d735-4a55-bcfc-3fa5da2c48ab" containerName="extract-utilities" Oct 12 08:21:12 crc kubenswrapper[4599]: E1012 08:21:12.427293 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74692910-d735-4a55-bcfc-3fa5da2c48ab" containerName="extract-content" Oct 12 08:21:12 crc kubenswrapper[4599]: I1012 08:21:12.427298 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="74692910-d735-4a55-bcfc-3fa5da2c48ab" containerName="extract-content" Oct 12 08:21:12 crc kubenswrapper[4599]: E1012 08:21:12.427307 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74692910-d735-4a55-bcfc-3fa5da2c48ab" containerName="registry-server" Oct 12 08:21:12 crc kubenswrapper[4599]: I1012 08:21:12.427312 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="74692910-d735-4a55-bcfc-3fa5da2c48ab" containerName="registry-server" Oct 12 08:21:12 crc kubenswrapper[4599]: I1012 08:21:12.427514 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="74692910-d735-4a55-bcfc-3fa5da2c48ab" containerName="registry-server" Oct 12 08:21:12 crc kubenswrapper[4599]: I1012 08:21:12.428709 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zq5xp" Oct 12 08:21:12 crc kubenswrapper[4599]: I1012 08:21:12.448825 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zq5xp"] Oct 12 08:21:12 crc kubenswrapper[4599]: I1012 08:21:12.493850 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d98dc2-79df-489a-929c-405f60a5930c-catalog-content\") pod \"community-operators-zq5xp\" (UID: \"08d98dc2-79df-489a-929c-405f60a5930c\") " pod="openshift-marketplace/community-operators-zq5xp" Oct 12 08:21:12 crc kubenswrapper[4599]: I1012 08:21:12.494365 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4247\" (UniqueName: \"kubernetes.io/projected/08d98dc2-79df-489a-929c-405f60a5930c-kube-api-access-n4247\") pod \"community-operators-zq5xp\" (UID: \"08d98dc2-79df-489a-929c-405f60a5930c\") " pod="openshift-marketplace/community-operators-zq5xp" Oct 12 08:21:12 crc kubenswrapper[4599]: I1012 08:21:12.494486 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d98dc2-79df-489a-929c-405f60a5930c-utilities\") pod \"community-operators-zq5xp\" (UID: \"08d98dc2-79df-489a-929c-405f60a5930c\") " pod="openshift-marketplace/community-operators-zq5xp" Oct 12 08:21:12 crc kubenswrapper[4599]: I1012 08:21:12.595895 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d98dc2-79df-489a-929c-405f60a5930c-catalog-content\") pod \"community-operators-zq5xp\" (UID: \"08d98dc2-79df-489a-929c-405f60a5930c\") " pod="openshift-marketplace/community-operators-zq5xp" Oct 12 08:21:12 crc kubenswrapper[4599]: I1012 08:21:12.596001 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-n4247\" (UniqueName: \"kubernetes.io/projected/08d98dc2-79df-489a-929c-405f60a5930c-kube-api-access-n4247\") pod \"community-operators-zq5xp\" (UID: \"08d98dc2-79df-489a-929c-405f60a5930c\") " pod="openshift-marketplace/community-operators-zq5xp" Oct 12 08:21:12 crc kubenswrapper[4599]: I1012 08:21:12.596034 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d98dc2-79df-489a-929c-405f60a5930c-utilities\") pod \"community-operators-zq5xp\" (UID: \"08d98dc2-79df-489a-929c-405f60a5930c\") " pod="openshift-marketplace/community-operators-zq5xp" Oct 12 08:21:12 crc kubenswrapper[4599]: I1012 08:21:12.596293 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d98dc2-79df-489a-929c-405f60a5930c-catalog-content\") pod \"community-operators-zq5xp\" (UID: \"08d98dc2-79df-489a-929c-405f60a5930c\") " pod="openshift-marketplace/community-operators-zq5xp" Oct 12 08:21:12 crc kubenswrapper[4599]: I1012 08:21:12.596767 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d98dc2-79df-489a-929c-405f60a5930c-utilities\") pod \"community-operators-zq5xp\" (UID: \"08d98dc2-79df-489a-929c-405f60a5930c\") " pod="openshift-marketplace/community-operators-zq5xp" Oct 12 08:21:12 crc kubenswrapper[4599]: I1012 08:21:12.612305 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4247\" (UniqueName: \"kubernetes.io/projected/08d98dc2-79df-489a-929c-405f60a5930c-kube-api-access-n4247\") pod \"community-operators-zq5xp\" (UID: \"08d98dc2-79df-489a-929c-405f60a5930c\") " pod="openshift-marketplace/community-operators-zq5xp" Oct 12 08:21:12 crc kubenswrapper[4599]: I1012 08:21:12.745425 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zq5xp" Oct 12 08:21:13 crc kubenswrapper[4599]: I1012 08:21:13.123603 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zq5xp"] Oct 12 08:21:13 crc kubenswrapper[4599]: I1012 08:21:13.821479 4599 generic.go:334] "Generic (PLEG): container finished" podID="08d98dc2-79df-489a-929c-405f60a5930c" containerID="73d1f3d962814975abf3429575c1f38890802d9f08db71ef6b3f31114d778e77" exitCode=0 Oct 12 08:21:13 crc kubenswrapper[4599]: I1012 08:21:13.821531 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zq5xp" event={"ID":"08d98dc2-79df-489a-929c-405f60a5930c","Type":"ContainerDied","Data":"73d1f3d962814975abf3429575c1f38890802d9f08db71ef6b3f31114d778e77"} Oct 12 08:21:13 crc kubenswrapper[4599]: I1012 08:21:13.822815 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zq5xp" event={"ID":"08d98dc2-79df-489a-929c-405f60a5930c","Type":"ContainerStarted","Data":"ca772948092a23049228708905c72109666f8a11cf27f54f4fed5104b39d8f97"} Oct 12 08:21:13 crc kubenswrapper[4599]: I1012 08:21:13.822838 4599 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 08:21:14 crc kubenswrapper[4599]: I1012 08:21:14.829157 4599 generic.go:334] "Generic (PLEG): container finished" podID="08d98dc2-79df-489a-929c-405f60a5930c" containerID="922d72ccdd5084faaaaac93acae7af51c4fb821bd6505f3edac658d56a82ca63" exitCode=0 Oct 12 08:21:14 crc kubenswrapper[4599]: I1012 08:21:14.829221 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zq5xp" event={"ID":"08d98dc2-79df-489a-929c-405f60a5930c","Type":"ContainerDied","Data":"922d72ccdd5084faaaaac93acae7af51c4fb821bd6505f3edac658d56a82ca63"} Oct 12 08:21:15 crc kubenswrapper[4599]: I1012 08:21:15.840414 4599 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-zq5xp" event={"ID":"08d98dc2-79df-489a-929c-405f60a5930c","Type":"ContainerStarted","Data":"425ae3d58b10870ad3366bba671ea33d019d7e89606d8089b78c17decb1d41d6"} Oct 12 08:21:15 crc kubenswrapper[4599]: I1012 08:21:15.857822 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zq5xp" podStartSLOduration=2.395219022 podStartE2EDuration="3.857807882s" podCreationTimestamp="2025-10-12 08:21:12 +0000 UTC" firstStartedPulling="2025-10-12 08:21:13.822651439 +0000 UTC m=+2770.611846941" lastFinishedPulling="2025-10-12 08:21:15.285240299 +0000 UTC m=+2772.074435801" observedRunningTime="2025-10-12 08:21:15.852634055 +0000 UTC m=+2772.641829557" watchObservedRunningTime="2025-10-12 08:21:15.857807882 +0000 UTC m=+2772.647003384" Oct 12 08:21:22 crc kubenswrapper[4599]: I1012 08:21:22.746184 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zq5xp" Oct 12 08:21:22 crc kubenswrapper[4599]: I1012 08:21:22.746562 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zq5xp" Oct 12 08:21:22 crc kubenswrapper[4599]: I1012 08:21:22.782681 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zq5xp" Oct 12 08:21:22 crc kubenswrapper[4599]: I1012 08:21:22.911567 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zq5xp" Oct 12 08:21:23 crc kubenswrapper[4599]: I1012 08:21:23.008777 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zq5xp"] Oct 12 08:21:24 crc kubenswrapper[4599]: I1012 08:21:24.890778 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zq5xp" 
podUID="08d98dc2-79df-489a-929c-405f60a5930c" containerName="registry-server" containerID="cri-o://425ae3d58b10870ad3366bba671ea33d019d7e89606d8089b78c17decb1d41d6" gracePeriod=2 Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.260318 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zq5xp" Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.374983 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d98dc2-79df-489a-929c-405f60a5930c-catalog-content\") pod \"08d98dc2-79df-489a-929c-405f60a5930c\" (UID: \"08d98dc2-79df-489a-929c-405f60a5930c\") " Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.375048 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4247\" (UniqueName: \"kubernetes.io/projected/08d98dc2-79df-489a-929c-405f60a5930c-kube-api-access-n4247\") pod \"08d98dc2-79df-489a-929c-405f60a5930c\" (UID: \"08d98dc2-79df-489a-929c-405f60a5930c\") " Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.375085 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d98dc2-79df-489a-929c-405f60a5930c-utilities\") pod \"08d98dc2-79df-489a-929c-405f60a5930c\" (UID: \"08d98dc2-79df-489a-929c-405f60a5930c\") " Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.376167 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08d98dc2-79df-489a-929c-405f60a5930c-utilities" (OuterVolumeSpecName: "utilities") pod "08d98dc2-79df-489a-929c-405f60a5930c" (UID: "08d98dc2-79df-489a-929c-405f60a5930c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.389776 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08d98dc2-79df-489a-929c-405f60a5930c-kube-api-access-n4247" (OuterVolumeSpecName: "kube-api-access-n4247") pod "08d98dc2-79df-489a-929c-405f60a5930c" (UID: "08d98dc2-79df-489a-929c-405f60a5930c"). InnerVolumeSpecName "kube-api-access-n4247". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.412254 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08d98dc2-79df-489a-929c-405f60a5930c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08d98dc2-79df-489a-929c-405f60a5930c" (UID: "08d98dc2-79df-489a-929c-405f60a5930c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.477417 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4247\" (UniqueName: \"kubernetes.io/projected/08d98dc2-79df-489a-929c-405f60a5930c-kube-api-access-n4247\") on node \"crc\" DevicePath \"\"" Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.477440 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d98dc2-79df-489a-929c-405f60a5930c-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.477450 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d98dc2-79df-489a-929c-405f60a5930c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.899571 4599 generic.go:334] "Generic (PLEG): container finished" podID="08d98dc2-79df-489a-929c-405f60a5930c" 
containerID="425ae3d58b10870ad3366bba671ea33d019d7e89606d8089b78c17decb1d41d6" exitCode=0 Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.899635 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zq5xp" Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.899655 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zq5xp" event={"ID":"08d98dc2-79df-489a-929c-405f60a5930c","Type":"ContainerDied","Data":"425ae3d58b10870ad3366bba671ea33d019d7e89606d8089b78c17decb1d41d6"} Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.900230 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zq5xp" event={"ID":"08d98dc2-79df-489a-929c-405f60a5930c","Type":"ContainerDied","Data":"ca772948092a23049228708905c72109666f8a11cf27f54f4fed5104b39d8f97"} Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.900265 4599 scope.go:117] "RemoveContainer" containerID="425ae3d58b10870ad3366bba671ea33d019d7e89606d8089b78c17decb1d41d6" Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.915673 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zq5xp"] Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.920491 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zq5xp"] Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.923666 4599 scope.go:117] "RemoveContainer" containerID="922d72ccdd5084faaaaac93acae7af51c4fb821bd6505f3edac658d56a82ca63" Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.938553 4599 scope.go:117] "RemoveContainer" containerID="73d1f3d962814975abf3429575c1f38890802d9f08db71ef6b3f31114d778e77" Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.968543 4599 scope.go:117] "RemoveContainer" containerID="425ae3d58b10870ad3366bba671ea33d019d7e89606d8089b78c17decb1d41d6" Oct 12 
08:21:25 crc kubenswrapper[4599]: E1012 08:21:25.969167 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"425ae3d58b10870ad3366bba671ea33d019d7e89606d8089b78c17decb1d41d6\": container with ID starting with 425ae3d58b10870ad3366bba671ea33d019d7e89606d8089b78c17decb1d41d6 not found: ID does not exist" containerID="425ae3d58b10870ad3366bba671ea33d019d7e89606d8089b78c17decb1d41d6" Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.969281 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"425ae3d58b10870ad3366bba671ea33d019d7e89606d8089b78c17decb1d41d6"} err="failed to get container status \"425ae3d58b10870ad3366bba671ea33d019d7e89606d8089b78c17decb1d41d6\": rpc error: code = NotFound desc = could not find container \"425ae3d58b10870ad3366bba671ea33d019d7e89606d8089b78c17decb1d41d6\": container with ID starting with 425ae3d58b10870ad3366bba671ea33d019d7e89606d8089b78c17decb1d41d6 not found: ID does not exist" Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.969386 4599 scope.go:117] "RemoveContainer" containerID="922d72ccdd5084faaaaac93acae7af51c4fb821bd6505f3edac658d56a82ca63" Oct 12 08:21:25 crc kubenswrapper[4599]: E1012 08:21:25.970093 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"922d72ccdd5084faaaaac93acae7af51c4fb821bd6505f3edac658d56a82ca63\": container with ID starting with 922d72ccdd5084faaaaac93acae7af51c4fb821bd6505f3edac658d56a82ca63 not found: ID does not exist" containerID="922d72ccdd5084faaaaac93acae7af51c4fb821bd6505f3edac658d56a82ca63" Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.970119 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"922d72ccdd5084faaaaac93acae7af51c4fb821bd6505f3edac658d56a82ca63"} err="failed to get container status 
\"922d72ccdd5084faaaaac93acae7af51c4fb821bd6505f3edac658d56a82ca63\": rpc error: code = NotFound desc = could not find container \"922d72ccdd5084faaaaac93acae7af51c4fb821bd6505f3edac658d56a82ca63\": container with ID starting with 922d72ccdd5084faaaaac93acae7af51c4fb821bd6505f3edac658d56a82ca63 not found: ID does not exist" Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.970138 4599 scope.go:117] "RemoveContainer" containerID="73d1f3d962814975abf3429575c1f38890802d9f08db71ef6b3f31114d778e77" Oct 12 08:21:25 crc kubenswrapper[4599]: E1012 08:21:25.970600 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73d1f3d962814975abf3429575c1f38890802d9f08db71ef6b3f31114d778e77\": container with ID starting with 73d1f3d962814975abf3429575c1f38890802d9f08db71ef6b3f31114d778e77 not found: ID does not exist" containerID="73d1f3d962814975abf3429575c1f38890802d9f08db71ef6b3f31114d778e77" Oct 12 08:21:25 crc kubenswrapper[4599]: I1012 08:21:25.970687 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d1f3d962814975abf3429575c1f38890802d9f08db71ef6b3f31114d778e77"} err="failed to get container status \"73d1f3d962814975abf3429575c1f38890802d9f08db71ef6b3f31114d778e77\": rpc error: code = NotFound desc = could not find container \"73d1f3d962814975abf3429575c1f38890802d9f08db71ef6b3f31114d778e77\": container with ID starting with 73d1f3d962814975abf3429575c1f38890802d9f08db71ef6b3f31114d778e77 not found: ID does not exist" Oct 12 08:21:27 crc kubenswrapper[4599]: I1012 08:21:27.552770 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08d98dc2-79df-489a-929c-405f60a5930c" path="/var/lib/kubelet/pods/08d98dc2-79df-489a-929c-405f60a5930c/volumes" Oct 12 08:21:28 crc kubenswrapper[4599]: I1012 08:21:28.321647 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 08:21:28 crc kubenswrapper[4599]: I1012 08:21:28.321693 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 08:21:41 crc kubenswrapper[4599]: I1012 08:21:41.997361 4599 generic.go:334] "Generic (PLEG): container finished" podID="bdf67e61-9c15-4079-9a9b-a74c40ad364f" containerID="374d6d02b7a167e355b4e5c7f6c1e0680c1a93b8820598ad630e6d950b12eccd" exitCode=0 Oct 12 08:21:41 crc kubenswrapper[4599]: I1012 08:21:41.997380 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"bdf67e61-9c15-4079-9a9b-a74c40ad364f","Type":"ContainerDied","Data":"374d6d02b7a167e355b4e5c7f6c1e0680c1a93b8820598ad630e6d950b12eccd"} Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.276076 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.463133 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bdf67e61-9c15-4079-9a9b-a74c40ad364f-ssh-key\") pod \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.463328 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.463393 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bdf67e61-9c15-4079-9a9b-a74c40ad364f-ca-certs\") pod \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.463439 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bdf67e61-9c15-4079-9a9b-a74c40ad364f-test-operator-ephemeral-temporary\") pod \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.463480 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwcrv\" (UniqueName: \"kubernetes.io/projected/bdf67e61-9c15-4079-9a9b-a74c40ad364f-kube-api-access-hwcrv\") pod \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.463552 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" 
(UniqueName: \"kubernetes.io/configmap/bdf67e61-9c15-4079-9a9b-a74c40ad364f-openstack-config\") pod \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.463594 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bdf67e61-9c15-4079-9a9b-a74c40ad364f-openstack-config-secret\") pod \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.463664 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bdf67e61-9c15-4079-9a9b-a74c40ad364f-test-operator-ephemeral-workdir\") pod \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.463702 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdf67e61-9c15-4079-9a9b-a74c40ad364f-config-data\") pod \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\" (UID: \"bdf67e61-9c15-4079-9a9b-a74c40ad364f\") " Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.464187 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdf67e61-9c15-4079-9a9b-a74c40ad364f-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "bdf67e61-9c15-4079-9a9b-a74c40ad364f" (UID: "bdf67e61-9c15-4079-9a9b-a74c40ad364f"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.464790 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdf67e61-9c15-4079-9a9b-a74c40ad364f-config-data" (OuterVolumeSpecName: "config-data") pod "bdf67e61-9c15-4079-9a9b-a74c40ad364f" (UID: "bdf67e61-9c15-4079-9a9b-a74c40ad364f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.469564 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "bdf67e61-9c15-4079-9a9b-a74c40ad364f" (UID: "bdf67e61-9c15-4079-9a9b-a74c40ad364f"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.469745 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdf67e61-9c15-4079-9a9b-a74c40ad364f-kube-api-access-hwcrv" (OuterVolumeSpecName: "kube-api-access-hwcrv") pod "bdf67e61-9c15-4079-9a9b-a74c40ad364f" (UID: "bdf67e61-9c15-4079-9a9b-a74c40ad364f"). InnerVolumeSpecName "kube-api-access-hwcrv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.476168 4599 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdf67e61-9c15-4079-9a9b-a74c40ad364f-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.476216 4599 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.476228 4599 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bdf67e61-9c15-4079-9a9b-a74c40ad364f-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.476238 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwcrv\" (UniqueName: \"kubernetes.io/projected/bdf67e61-9c15-4079-9a9b-a74c40ad364f-kube-api-access-hwcrv\") on node \"crc\" DevicePath \"\"" Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.477267 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdf67e61-9c15-4079-9a9b-a74c40ad364f-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "bdf67e61-9c15-4079-9a9b-a74c40ad364f" (UID: "bdf67e61-9c15-4079-9a9b-a74c40ad364f"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.485351 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf67e61-9c15-4079-9a9b-a74c40ad364f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "bdf67e61-9c15-4079-9a9b-a74c40ad364f" (UID: "bdf67e61-9c15-4079-9a9b-a74c40ad364f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.487002 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf67e61-9c15-4079-9a9b-a74c40ad364f-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "bdf67e61-9c15-4079-9a9b-a74c40ad364f" (UID: "bdf67e61-9c15-4079-9a9b-a74c40ad364f"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.487834 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf67e61-9c15-4079-9a9b-a74c40ad364f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bdf67e61-9c15-4079-9a9b-a74c40ad364f" (UID: "bdf67e61-9c15-4079-9a9b-a74c40ad364f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.493082 4599 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.499063 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdf67e61-9c15-4079-9a9b-a74c40ad364f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "bdf67e61-9c15-4079-9a9b-a74c40ad364f" (UID: "bdf67e61-9c15-4079-9a9b-a74c40ad364f"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.577276 4599 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bdf67e61-9c15-4079-9a9b-a74c40ad364f-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.577302 4599 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bdf67e61-9c15-4079-9a9b-a74c40ad364f-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.577312 4599 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bdf67e61-9c15-4079-9a9b-a74c40ad364f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.577322 4599 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bdf67e61-9c15-4079-9a9b-a74c40ad364f-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.577332 4599 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bdf67e61-9c15-4079-9a9b-a74c40ad364f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 08:21:43 crc kubenswrapper[4599]: I1012 08:21:43.577354 4599 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 12 08:21:44 crc kubenswrapper[4599]: I1012 08:21:44.011385 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"bdf67e61-9c15-4079-9a9b-a74c40ad364f","Type":"ContainerDied","Data":"a7b48c8de6e13c722cc3bb00b3b1e5faebea30affa8e71d3b41e20a19f64add4"} Oct 12 08:21:44 crc 
kubenswrapper[4599]: I1012 08:21:44.011425 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7b48c8de6e13c722cc3bb00b3b1e5faebea30affa8e71d3b41e20a19f64add4" Oct 12 08:21:44 crc kubenswrapper[4599]: I1012 08:21:44.011469 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 12 08:21:51 crc kubenswrapper[4599]: I1012 08:21:51.153368 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 12 08:21:51 crc kubenswrapper[4599]: E1012 08:21:51.154183 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d98dc2-79df-489a-929c-405f60a5930c" containerName="registry-server" Oct 12 08:21:51 crc kubenswrapper[4599]: I1012 08:21:51.154195 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d98dc2-79df-489a-929c-405f60a5930c" containerName="registry-server" Oct 12 08:21:51 crc kubenswrapper[4599]: E1012 08:21:51.154222 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d98dc2-79df-489a-929c-405f60a5930c" containerName="extract-content" Oct 12 08:21:51 crc kubenswrapper[4599]: I1012 08:21:51.154229 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d98dc2-79df-489a-929c-405f60a5930c" containerName="extract-content" Oct 12 08:21:51 crc kubenswrapper[4599]: E1012 08:21:51.154240 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d98dc2-79df-489a-929c-405f60a5930c" containerName="extract-utilities" Oct 12 08:21:51 crc kubenswrapper[4599]: I1012 08:21:51.154249 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d98dc2-79df-489a-929c-405f60a5930c" containerName="extract-utilities" Oct 12 08:21:51 crc kubenswrapper[4599]: E1012 08:21:51.154256 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdf67e61-9c15-4079-9a9b-a74c40ad364f" containerName="tempest-tests-tempest-tests-runner" Oct 12 
08:21:51 crc kubenswrapper[4599]: I1012 08:21:51.154263 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdf67e61-9c15-4079-9a9b-a74c40ad364f" containerName="tempest-tests-tempest-tests-runner" Oct 12 08:21:51 crc kubenswrapper[4599]: I1012 08:21:51.154438 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdf67e61-9c15-4079-9a9b-a74c40ad364f" containerName="tempest-tests-tempest-tests-runner" Oct 12 08:21:51 crc kubenswrapper[4599]: I1012 08:21:51.154454 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="08d98dc2-79df-489a-929c-405f60a5930c" containerName="registry-server" Oct 12 08:21:51 crc kubenswrapper[4599]: I1012 08:21:51.155022 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 08:21:51 crc kubenswrapper[4599]: I1012 08:21:51.157829 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bdcjd" Oct 12 08:21:51 crc kubenswrapper[4599]: I1012 08:21:51.159851 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 12 08:21:51 crc kubenswrapper[4599]: I1012 08:21:51.311743 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6mz7\" (UniqueName: \"kubernetes.io/projected/11baa2a2-2767-4dde-96b3-708570c6575d-kube-api-access-p6mz7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"11baa2a2-2767-4dde-96b3-708570c6575d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 08:21:51 crc kubenswrapper[4599]: I1012 08:21:51.312405 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: 
\"11baa2a2-2767-4dde-96b3-708570c6575d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 08:21:51 crc kubenswrapper[4599]: I1012 08:21:51.422992 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"11baa2a2-2767-4dde-96b3-708570c6575d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 08:21:51 crc kubenswrapper[4599]: I1012 08:21:51.423227 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6mz7\" (UniqueName: \"kubernetes.io/projected/11baa2a2-2767-4dde-96b3-708570c6575d-kube-api-access-p6mz7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"11baa2a2-2767-4dde-96b3-708570c6575d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 08:21:51 crc kubenswrapper[4599]: I1012 08:21:51.430895 4599 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"11baa2a2-2767-4dde-96b3-708570c6575d\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 08:21:51 crc kubenswrapper[4599]: I1012 08:21:51.528468 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"11baa2a2-2767-4dde-96b3-708570c6575d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 08:21:51 crc kubenswrapper[4599]: I1012 08:21:51.531563 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6mz7\" 
(UniqueName: \"kubernetes.io/projected/11baa2a2-2767-4dde-96b3-708570c6575d-kube-api-access-p6mz7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"11baa2a2-2767-4dde-96b3-708570c6575d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 08:21:51 crc kubenswrapper[4599]: I1012 08:21:51.773242 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 08:21:52 crc kubenswrapper[4599]: W1012 08:21:52.163611 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11baa2a2_2767_4dde_96b3_708570c6575d.slice/crio-e38e3a609cd49bc0247a680965f0b074dad1d6426c732c6feda5dfbd6b603de0 WatchSource:0}: Error finding container e38e3a609cd49bc0247a680965f0b074dad1d6426c732c6feda5dfbd6b603de0: Status 404 returned error can't find the container with id e38e3a609cd49bc0247a680965f0b074dad1d6426c732c6feda5dfbd6b603de0 Oct 12 08:21:52 crc kubenswrapper[4599]: I1012 08:21:52.164248 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 12 08:21:53 crc kubenswrapper[4599]: I1012 08:21:53.087177 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"11baa2a2-2767-4dde-96b3-708570c6575d","Type":"ContainerStarted","Data":"e38e3a609cd49bc0247a680965f0b074dad1d6426c732c6feda5dfbd6b603de0"} Oct 12 08:21:54 crc kubenswrapper[4599]: I1012 08:21:54.099137 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"11baa2a2-2767-4dde-96b3-708570c6575d","Type":"ContainerStarted","Data":"30c9482912277d9ef4640ec063a9a3045da776e5df89848c5c547485b5ef7126"} Oct 12 08:21:54 crc kubenswrapper[4599]: I1012 08:21:54.111909 4599 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.203204078 podStartE2EDuration="3.111885962s" podCreationTimestamp="2025-10-12 08:21:51 +0000 UTC" firstStartedPulling="2025-10-12 08:21:52.167118777 +0000 UTC m=+2808.956314279" lastFinishedPulling="2025-10-12 08:21:53.075800662 +0000 UTC m=+2809.864996163" observedRunningTime="2025-10-12 08:21:54.109777337 +0000 UTC m=+2810.898972839" watchObservedRunningTime="2025-10-12 08:21:54.111885962 +0000 UTC m=+2810.901081465" Oct 12 08:21:58 crc kubenswrapper[4599]: I1012 08:21:58.321405 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 08:21:58 crc kubenswrapper[4599]: I1012 08:21:58.321599 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 08:22:07 crc kubenswrapper[4599]: I1012 08:22:07.712886 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-s256c/must-gather-76gb2"] Oct 12 08:22:07 crc kubenswrapper[4599]: I1012 08:22:07.714565 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s256c/must-gather-76gb2" Oct 12 08:22:07 crc kubenswrapper[4599]: I1012 08:22:07.729908 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s256c/must-gather-76gb2"] Oct 12 08:22:07 crc kubenswrapper[4599]: I1012 08:22:07.731612 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-s256c"/"openshift-service-ca.crt" Oct 12 08:22:07 crc kubenswrapper[4599]: I1012 08:22:07.731653 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-s256c"/"default-dockercfg-d72fn" Oct 12 08:22:07 crc kubenswrapper[4599]: I1012 08:22:07.734239 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-s256c"/"kube-root-ca.crt" Oct 12 08:22:07 crc kubenswrapper[4599]: I1012 08:22:07.782816 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9qhk\" (UniqueName: \"kubernetes.io/projected/fb5ad21b-8e6a-43cb-b194-16397872fd71-kube-api-access-g9qhk\") pod \"must-gather-76gb2\" (UID: \"fb5ad21b-8e6a-43cb-b194-16397872fd71\") " pod="openshift-must-gather-s256c/must-gather-76gb2" Oct 12 08:22:07 crc kubenswrapper[4599]: I1012 08:22:07.782875 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fb5ad21b-8e6a-43cb-b194-16397872fd71-must-gather-output\") pod \"must-gather-76gb2\" (UID: \"fb5ad21b-8e6a-43cb-b194-16397872fd71\") " pod="openshift-must-gather-s256c/must-gather-76gb2" Oct 12 08:22:07 crc kubenswrapper[4599]: I1012 08:22:07.884156 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9qhk\" (UniqueName: \"kubernetes.io/projected/fb5ad21b-8e6a-43cb-b194-16397872fd71-kube-api-access-g9qhk\") pod \"must-gather-76gb2\" (UID: \"fb5ad21b-8e6a-43cb-b194-16397872fd71\") " 
pod="openshift-must-gather-s256c/must-gather-76gb2" Oct 12 08:22:07 crc kubenswrapper[4599]: I1012 08:22:07.884230 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fb5ad21b-8e6a-43cb-b194-16397872fd71-must-gather-output\") pod \"must-gather-76gb2\" (UID: \"fb5ad21b-8e6a-43cb-b194-16397872fd71\") " pod="openshift-must-gather-s256c/must-gather-76gb2" Oct 12 08:22:07 crc kubenswrapper[4599]: I1012 08:22:07.884636 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fb5ad21b-8e6a-43cb-b194-16397872fd71-must-gather-output\") pod \"must-gather-76gb2\" (UID: \"fb5ad21b-8e6a-43cb-b194-16397872fd71\") " pod="openshift-must-gather-s256c/must-gather-76gb2" Oct 12 08:22:07 crc kubenswrapper[4599]: I1012 08:22:07.898722 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9qhk\" (UniqueName: \"kubernetes.io/projected/fb5ad21b-8e6a-43cb-b194-16397872fd71-kube-api-access-g9qhk\") pod \"must-gather-76gb2\" (UID: \"fb5ad21b-8e6a-43cb-b194-16397872fd71\") " pod="openshift-must-gather-s256c/must-gather-76gb2" Oct 12 08:22:08 crc kubenswrapper[4599]: I1012 08:22:08.027460 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s256c/must-gather-76gb2" Oct 12 08:22:08 crc kubenswrapper[4599]: I1012 08:22:08.407653 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s256c/must-gather-76gb2"] Oct 12 08:22:08 crc kubenswrapper[4599]: W1012 08:22:08.409928 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb5ad21b_8e6a_43cb_b194_16397872fd71.slice/crio-3d50bf5476f95b5ee038e0b7e119ab47c3deda2168edabf0fdd60946a40ede86 WatchSource:0}: Error finding container 3d50bf5476f95b5ee038e0b7e119ab47c3deda2168edabf0fdd60946a40ede86: Status 404 returned error can't find the container with id 3d50bf5476f95b5ee038e0b7e119ab47c3deda2168edabf0fdd60946a40ede86 Oct 12 08:22:09 crc kubenswrapper[4599]: I1012 08:22:09.193077 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s256c/must-gather-76gb2" event={"ID":"fb5ad21b-8e6a-43cb-b194-16397872fd71","Type":"ContainerStarted","Data":"3d50bf5476f95b5ee038e0b7e119ab47c3deda2168edabf0fdd60946a40ede86"} Oct 12 08:22:15 crc kubenswrapper[4599]: I1012 08:22:15.230225 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s256c/must-gather-76gb2" event={"ID":"fb5ad21b-8e6a-43cb-b194-16397872fd71","Type":"ContainerStarted","Data":"3c9f5a86f36bd272a50df83079ffbd8469e49e79ed4e90bb337b06616f967174"} Oct 12 08:22:15 crc kubenswrapper[4599]: I1012 08:22:15.230566 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s256c/must-gather-76gb2" event={"ID":"fb5ad21b-8e6a-43cb-b194-16397872fd71","Type":"ContainerStarted","Data":"f4e1f78f2906119c4c04399a39cad0e49ac30185490301f7ca759337d4deed6e"} Oct 12 08:22:15 crc kubenswrapper[4599]: I1012 08:22:15.245678 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-s256c/must-gather-76gb2" podStartSLOduration=2.528596106 
podStartE2EDuration="8.245665917s" podCreationTimestamp="2025-10-12 08:22:07 +0000 UTC" firstStartedPulling="2025-10-12 08:22:08.411825837 +0000 UTC m=+2825.201021339" lastFinishedPulling="2025-10-12 08:22:14.128895649 +0000 UTC m=+2830.918091150" observedRunningTime="2025-10-12 08:22:15.244380984 +0000 UTC m=+2832.033576486" watchObservedRunningTime="2025-10-12 08:22:15.245665917 +0000 UTC m=+2832.034861419" Oct 12 08:22:16 crc kubenswrapper[4599]: I1012 08:22:16.754143 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-s256c/crc-debug-l2vcx"] Oct 12 08:22:16 crc kubenswrapper[4599]: I1012 08:22:16.755301 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s256c/crc-debug-l2vcx" Oct 12 08:22:16 crc kubenswrapper[4599]: I1012 08:22:16.927268 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/709ce809-4b30-4942-99d0-5ee376b2bba5-host\") pod \"crc-debug-l2vcx\" (UID: \"709ce809-4b30-4942-99d0-5ee376b2bba5\") " pod="openshift-must-gather-s256c/crc-debug-l2vcx" Oct 12 08:22:16 crc kubenswrapper[4599]: I1012 08:22:16.927380 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmsmr\" (UniqueName: \"kubernetes.io/projected/709ce809-4b30-4942-99d0-5ee376b2bba5-kube-api-access-fmsmr\") pod \"crc-debug-l2vcx\" (UID: \"709ce809-4b30-4942-99d0-5ee376b2bba5\") " pod="openshift-must-gather-s256c/crc-debug-l2vcx" Oct 12 08:22:17 crc kubenswrapper[4599]: I1012 08:22:17.028899 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/709ce809-4b30-4942-99d0-5ee376b2bba5-host\") pod \"crc-debug-l2vcx\" (UID: \"709ce809-4b30-4942-99d0-5ee376b2bba5\") " pod="openshift-must-gather-s256c/crc-debug-l2vcx" Oct 12 08:22:17 crc kubenswrapper[4599]: I1012 08:22:17.028983 4599 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmsmr\" (UniqueName: \"kubernetes.io/projected/709ce809-4b30-4942-99d0-5ee376b2bba5-kube-api-access-fmsmr\") pod \"crc-debug-l2vcx\" (UID: \"709ce809-4b30-4942-99d0-5ee376b2bba5\") " pod="openshift-must-gather-s256c/crc-debug-l2vcx" Oct 12 08:22:17 crc kubenswrapper[4599]: I1012 08:22:17.029054 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/709ce809-4b30-4942-99d0-5ee376b2bba5-host\") pod \"crc-debug-l2vcx\" (UID: \"709ce809-4b30-4942-99d0-5ee376b2bba5\") " pod="openshift-must-gather-s256c/crc-debug-l2vcx" Oct 12 08:22:17 crc kubenswrapper[4599]: I1012 08:22:17.043526 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmsmr\" (UniqueName: \"kubernetes.io/projected/709ce809-4b30-4942-99d0-5ee376b2bba5-kube-api-access-fmsmr\") pod \"crc-debug-l2vcx\" (UID: \"709ce809-4b30-4942-99d0-5ee376b2bba5\") " pod="openshift-must-gather-s256c/crc-debug-l2vcx" Oct 12 08:22:17 crc kubenswrapper[4599]: I1012 08:22:17.069409 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s256c/crc-debug-l2vcx" Oct 12 08:22:17 crc kubenswrapper[4599]: W1012 08:22:17.093405 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod709ce809_4b30_4942_99d0_5ee376b2bba5.slice/crio-0962d953e8820d397d247c814f0c779c3940f8ec9241c558cce01e6d2b898d24 WatchSource:0}: Error finding container 0962d953e8820d397d247c814f0c779c3940f8ec9241c558cce01e6d2b898d24: Status 404 returned error can't find the container with id 0962d953e8820d397d247c814f0c779c3940f8ec9241c558cce01e6d2b898d24 Oct 12 08:22:17 crc kubenswrapper[4599]: I1012 08:22:17.244644 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s256c/crc-debug-l2vcx" event={"ID":"709ce809-4b30-4942-99d0-5ee376b2bba5","Type":"ContainerStarted","Data":"0962d953e8820d397d247c814f0c779c3940f8ec9241c558cce01e6d2b898d24"} Oct 12 08:22:27 crc kubenswrapper[4599]: I1012 08:22:27.317801 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s256c/crc-debug-l2vcx" event={"ID":"709ce809-4b30-4942-99d0-5ee376b2bba5","Type":"ContainerStarted","Data":"409bcca2550cca4c692327ea35754ac3b5d155a813911df185ff8cbb7ba911dc"} Oct 12 08:22:27 crc kubenswrapper[4599]: I1012 08:22:27.332328 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-s256c/crc-debug-l2vcx" podStartSLOduration=2.044087983 podStartE2EDuration="11.332317434s" podCreationTimestamp="2025-10-12 08:22:16 +0000 UTC" firstStartedPulling="2025-10-12 08:22:17.095665068 +0000 UTC m=+2833.884860570" lastFinishedPulling="2025-10-12 08:22:26.383894519 +0000 UTC m=+2843.173090021" observedRunningTime="2025-10-12 08:22:27.326804519 +0000 UTC m=+2844.116000020" watchObservedRunningTime="2025-10-12 08:22:27.332317434 +0000 UTC m=+2844.121512935" Oct 12 08:22:28 crc kubenswrapper[4599]: I1012 08:22:28.321769 4599 patch_prober.go:28] interesting 
pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 08:22:28 crc kubenswrapper[4599]: I1012 08:22:28.322324 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 08:22:28 crc kubenswrapper[4599]: I1012 08:22:28.322385 4599 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 08:22:28 crc kubenswrapper[4599]: I1012 08:22:28.323041 4599 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f"} pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 08:22:28 crc kubenswrapper[4599]: I1012 08:22:28.323085 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" containerID="cri-o://536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" gracePeriod=600 Oct 12 08:22:28 crc kubenswrapper[4599]: E1012 08:22:28.453511 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:22:29 crc kubenswrapper[4599]: I1012 08:22:29.334390 4599 generic.go:334] "Generic (PLEG): container finished" podID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" exitCode=0 Oct 12 08:22:29 crc kubenswrapper[4599]: I1012 08:22:29.334471 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerDied","Data":"536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f"} Oct 12 08:22:29 crc kubenswrapper[4599]: I1012 08:22:29.334773 4599 scope.go:117] "RemoveContainer" containerID="c2c141d22e067a6383af16f2655ff809ed93e78d4a047bd3c7fdced41256a1ea" Oct 12 08:22:29 crc kubenswrapper[4599]: I1012 08:22:29.335424 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:22:29 crc kubenswrapper[4599]: E1012 08:22:29.335658 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:22:43 crc kubenswrapper[4599]: I1012 08:22:43.551406 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:22:43 crc kubenswrapper[4599]: E1012 08:22:43.552095 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:22:54 crc kubenswrapper[4599]: I1012 08:22:54.545514 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:22:54 crc kubenswrapper[4599]: E1012 08:22:54.546300 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:22:55 crc kubenswrapper[4599]: I1012 08:22:55.531831 4599 generic.go:334] "Generic (PLEG): container finished" podID="709ce809-4b30-4942-99d0-5ee376b2bba5" containerID="409bcca2550cca4c692327ea35754ac3b5d155a813911df185ff8cbb7ba911dc" exitCode=0 Oct 12 08:22:55 crc kubenswrapper[4599]: I1012 08:22:55.531896 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s256c/crc-debug-l2vcx" event={"ID":"709ce809-4b30-4942-99d0-5ee376b2bba5","Type":"ContainerDied","Data":"409bcca2550cca4c692327ea35754ac3b5d155a813911df185ff8cbb7ba911dc"} Oct 12 08:22:56 crc kubenswrapper[4599]: I1012 08:22:56.617123 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s256c/crc-debug-l2vcx" Oct 12 08:22:56 crc kubenswrapper[4599]: I1012 08:22:56.648823 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-s256c/crc-debug-l2vcx"] Oct 12 08:22:56 crc kubenswrapper[4599]: I1012 08:22:56.654664 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-s256c/crc-debug-l2vcx"] Oct 12 08:22:56 crc kubenswrapper[4599]: I1012 08:22:56.754746 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmsmr\" (UniqueName: \"kubernetes.io/projected/709ce809-4b30-4942-99d0-5ee376b2bba5-kube-api-access-fmsmr\") pod \"709ce809-4b30-4942-99d0-5ee376b2bba5\" (UID: \"709ce809-4b30-4942-99d0-5ee376b2bba5\") " Oct 12 08:22:56 crc kubenswrapper[4599]: I1012 08:22:56.754864 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/709ce809-4b30-4942-99d0-5ee376b2bba5-host\") pod \"709ce809-4b30-4942-99d0-5ee376b2bba5\" (UID: \"709ce809-4b30-4942-99d0-5ee376b2bba5\") " Oct 12 08:22:56 crc kubenswrapper[4599]: I1012 08:22:56.754912 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/709ce809-4b30-4942-99d0-5ee376b2bba5-host" (OuterVolumeSpecName: "host") pod "709ce809-4b30-4942-99d0-5ee376b2bba5" (UID: "709ce809-4b30-4942-99d0-5ee376b2bba5"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 08:22:56 crc kubenswrapper[4599]: I1012 08:22:56.755498 4599 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/709ce809-4b30-4942-99d0-5ee376b2bba5-host\") on node \"crc\" DevicePath \"\"" Oct 12 08:22:56 crc kubenswrapper[4599]: I1012 08:22:56.760633 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/709ce809-4b30-4942-99d0-5ee376b2bba5-kube-api-access-fmsmr" (OuterVolumeSpecName: "kube-api-access-fmsmr") pod "709ce809-4b30-4942-99d0-5ee376b2bba5" (UID: "709ce809-4b30-4942-99d0-5ee376b2bba5"). InnerVolumeSpecName "kube-api-access-fmsmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:22:56 crc kubenswrapper[4599]: I1012 08:22:56.856882 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmsmr\" (UniqueName: \"kubernetes.io/projected/709ce809-4b30-4942-99d0-5ee376b2bba5-kube-api-access-fmsmr\") on node \"crc\" DevicePath \"\"" Oct 12 08:22:57 crc kubenswrapper[4599]: I1012 08:22:57.548790 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s256c/crc-debug-l2vcx" Oct 12 08:22:57 crc kubenswrapper[4599]: I1012 08:22:57.553148 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="709ce809-4b30-4942-99d0-5ee376b2bba5" path="/var/lib/kubelet/pods/709ce809-4b30-4942-99d0-5ee376b2bba5/volumes" Oct 12 08:22:57 crc kubenswrapper[4599]: I1012 08:22:57.553789 4599 scope.go:117] "RemoveContainer" containerID="409bcca2550cca4c692327ea35754ac3b5d155a813911df185ff8cbb7ba911dc" Oct 12 08:22:57 crc kubenswrapper[4599]: I1012 08:22:57.822746 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-s256c/crc-debug-hktzw"] Oct 12 08:22:57 crc kubenswrapper[4599]: E1012 08:22:57.824226 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="709ce809-4b30-4942-99d0-5ee376b2bba5" containerName="container-00" Oct 12 08:22:57 crc kubenswrapper[4599]: I1012 08:22:57.824248 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="709ce809-4b30-4942-99d0-5ee376b2bba5" containerName="container-00" Oct 12 08:22:57 crc kubenswrapper[4599]: I1012 08:22:57.824709 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="709ce809-4b30-4942-99d0-5ee376b2bba5" containerName="container-00" Oct 12 08:22:57 crc kubenswrapper[4599]: I1012 08:22:57.826132 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s256c/crc-debug-hktzw" Oct 12 08:22:57 crc kubenswrapper[4599]: I1012 08:22:57.979495 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9wbv\" (UniqueName: \"kubernetes.io/projected/d59c4aaa-4020-482e-8ade-0211754fd86e-kube-api-access-d9wbv\") pod \"crc-debug-hktzw\" (UID: \"d59c4aaa-4020-482e-8ade-0211754fd86e\") " pod="openshift-must-gather-s256c/crc-debug-hktzw" Oct 12 08:22:57 crc kubenswrapper[4599]: I1012 08:22:57.980062 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d59c4aaa-4020-482e-8ade-0211754fd86e-host\") pod \"crc-debug-hktzw\" (UID: \"d59c4aaa-4020-482e-8ade-0211754fd86e\") " pod="openshift-must-gather-s256c/crc-debug-hktzw" Oct 12 08:22:58 crc kubenswrapper[4599]: I1012 08:22:58.083675 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d59c4aaa-4020-482e-8ade-0211754fd86e-host\") pod \"crc-debug-hktzw\" (UID: \"d59c4aaa-4020-482e-8ade-0211754fd86e\") " pod="openshift-must-gather-s256c/crc-debug-hktzw" Oct 12 08:22:58 crc kubenswrapper[4599]: I1012 08:22:58.083844 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9wbv\" (UniqueName: \"kubernetes.io/projected/d59c4aaa-4020-482e-8ade-0211754fd86e-kube-api-access-d9wbv\") pod \"crc-debug-hktzw\" (UID: \"d59c4aaa-4020-482e-8ade-0211754fd86e\") " pod="openshift-must-gather-s256c/crc-debug-hktzw" Oct 12 08:22:58 crc kubenswrapper[4599]: I1012 08:22:58.083904 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d59c4aaa-4020-482e-8ade-0211754fd86e-host\") pod \"crc-debug-hktzw\" (UID: \"d59c4aaa-4020-482e-8ade-0211754fd86e\") " pod="openshift-must-gather-s256c/crc-debug-hktzw" Oct 12 08:22:58 crc 
kubenswrapper[4599]: I1012 08:22:58.103547 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9wbv\" (UniqueName: \"kubernetes.io/projected/d59c4aaa-4020-482e-8ade-0211754fd86e-kube-api-access-d9wbv\") pod \"crc-debug-hktzw\" (UID: \"d59c4aaa-4020-482e-8ade-0211754fd86e\") " pod="openshift-must-gather-s256c/crc-debug-hktzw" Oct 12 08:22:58 crc kubenswrapper[4599]: I1012 08:22:58.144172 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s256c/crc-debug-hktzw" Oct 12 08:22:58 crc kubenswrapper[4599]: I1012 08:22:58.557160 4599 generic.go:334] "Generic (PLEG): container finished" podID="d59c4aaa-4020-482e-8ade-0211754fd86e" containerID="349a117da203ab33f3f3130f17d622fc6a6214c18bf646deda76b9ec4d8bb039" exitCode=0 Oct 12 08:22:58 crc kubenswrapper[4599]: I1012 08:22:58.557227 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s256c/crc-debug-hktzw" event={"ID":"d59c4aaa-4020-482e-8ade-0211754fd86e","Type":"ContainerDied","Data":"349a117da203ab33f3f3130f17d622fc6a6214c18bf646deda76b9ec4d8bb039"} Oct 12 08:22:58 crc kubenswrapper[4599]: I1012 08:22:58.557877 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s256c/crc-debug-hktzw" event={"ID":"d59c4aaa-4020-482e-8ade-0211754fd86e","Type":"ContainerStarted","Data":"b8aac4d44cfc0ab7bb0a2ef0c95d6fb99f3236b7d0053f32d10f7313d0114de4"} Oct 12 08:22:59 crc kubenswrapper[4599]: I1012 08:22:59.028948 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-s256c/crc-debug-hktzw"] Oct 12 08:22:59 crc kubenswrapper[4599]: I1012 08:22:59.036159 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-s256c/crc-debug-hktzw"] Oct 12 08:22:59 crc kubenswrapper[4599]: I1012 08:22:59.637368 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s256c/crc-debug-hktzw" Oct 12 08:22:59 crc kubenswrapper[4599]: I1012 08:22:59.811827 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d59c4aaa-4020-482e-8ade-0211754fd86e-host\") pod \"d59c4aaa-4020-482e-8ade-0211754fd86e\" (UID: \"d59c4aaa-4020-482e-8ade-0211754fd86e\") " Oct 12 08:22:59 crc kubenswrapper[4599]: I1012 08:22:59.812060 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9wbv\" (UniqueName: \"kubernetes.io/projected/d59c4aaa-4020-482e-8ade-0211754fd86e-kube-api-access-d9wbv\") pod \"d59c4aaa-4020-482e-8ade-0211754fd86e\" (UID: \"d59c4aaa-4020-482e-8ade-0211754fd86e\") " Oct 12 08:22:59 crc kubenswrapper[4599]: I1012 08:22:59.812166 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d59c4aaa-4020-482e-8ade-0211754fd86e-host" (OuterVolumeSpecName: "host") pod "d59c4aaa-4020-482e-8ade-0211754fd86e" (UID: "d59c4aaa-4020-482e-8ade-0211754fd86e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 08:22:59 crc kubenswrapper[4599]: I1012 08:22:59.812647 4599 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d59c4aaa-4020-482e-8ade-0211754fd86e-host\") on node \"crc\" DevicePath \"\"" Oct 12 08:22:59 crc kubenswrapper[4599]: I1012 08:22:59.818767 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d59c4aaa-4020-482e-8ade-0211754fd86e-kube-api-access-d9wbv" (OuterVolumeSpecName: "kube-api-access-d9wbv") pod "d59c4aaa-4020-482e-8ade-0211754fd86e" (UID: "d59c4aaa-4020-482e-8ade-0211754fd86e"). InnerVolumeSpecName "kube-api-access-d9wbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:22:59 crc kubenswrapper[4599]: I1012 08:22:59.914167 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9wbv\" (UniqueName: \"kubernetes.io/projected/d59c4aaa-4020-482e-8ade-0211754fd86e-kube-api-access-d9wbv\") on node \"crc\" DevicePath \"\"" Oct 12 08:23:00 crc kubenswrapper[4599]: I1012 08:23:00.142591 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-s256c/crc-debug-t4ckk"] Oct 12 08:23:00 crc kubenswrapper[4599]: E1012 08:23:00.143496 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d59c4aaa-4020-482e-8ade-0211754fd86e" containerName="container-00" Oct 12 08:23:00 crc kubenswrapper[4599]: I1012 08:23:00.143590 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="d59c4aaa-4020-482e-8ade-0211754fd86e" containerName="container-00" Oct 12 08:23:00 crc kubenswrapper[4599]: I1012 08:23:00.143841 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="d59c4aaa-4020-482e-8ade-0211754fd86e" containerName="container-00" Oct 12 08:23:00 crc kubenswrapper[4599]: I1012 08:23:00.144449 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s256c/crc-debug-t4ckk" Oct 12 08:23:00 crc kubenswrapper[4599]: I1012 08:23:00.320632 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42bp7\" (UniqueName: \"kubernetes.io/projected/a9a9df04-a178-4ddb-a9e3-16ff994ef030-kube-api-access-42bp7\") pod \"crc-debug-t4ckk\" (UID: \"a9a9df04-a178-4ddb-a9e3-16ff994ef030\") " pod="openshift-must-gather-s256c/crc-debug-t4ckk" Oct 12 08:23:00 crc kubenswrapper[4599]: I1012 08:23:00.320981 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9a9df04-a178-4ddb-a9e3-16ff994ef030-host\") pod \"crc-debug-t4ckk\" (UID: \"a9a9df04-a178-4ddb-a9e3-16ff994ef030\") " pod="openshift-must-gather-s256c/crc-debug-t4ckk" Oct 12 08:23:00 crc kubenswrapper[4599]: I1012 08:23:00.421652 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42bp7\" (UniqueName: \"kubernetes.io/projected/a9a9df04-a178-4ddb-a9e3-16ff994ef030-kube-api-access-42bp7\") pod \"crc-debug-t4ckk\" (UID: \"a9a9df04-a178-4ddb-a9e3-16ff994ef030\") " pod="openshift-must-gather-s256c/crc-debug-t4ckk" Oct 12 08:23:00 crc kubenswrapper[4599]: I1012 08:23:00.421783 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9a9df04-a178-4ddb-a9e3-16ff994ef030-host\") pod \"crc-debug-t4ckk\" (UID: \"a9a9df04-a178-4ddb-a9e3-16ff994ef030\") " pod="openshift-must-gather-s256c/crc-debug-t4ckk" Oct 12 08:23:00 crc kubenswrapper[4599]: I1012 08:23:00.421859 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9a9df04-a178-4ddb-a9e3-16ff994ef030-host\") pod \"crc-debug-t4ckk\" (UID: \"a9a9df04-a178-4ddb-a9e3-16ff994ef030\") " pod="openshift-must-gather-s256c/crc-debug-t4ckk" Oct 12 08:23:00 crc 
kubenswrapper[4599]: I1012 08:23:00.435698 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42bp7\" (UniqueName: \"kubernetes.io/projected/a9a9df04-a178-4ddb-a9e3-16ff994ef030-kube-api-access-42bp7\") pod \"crc-debug-t4ckk\" (UID: \"a9a9df04-a178-4ddb-a9e3-16ff994ef030\") " pod="openshift-must-gather-s256c/crc-debug-t4ckk" Oct 12 08:23:00 crc kubenswrapper[4599]: I1012 08:23:00.458486 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s256c/crc-debug-t4ckk" Oct 12 08:23:00 crc kubenswrapper[4599]: W1012 08:23:00.488263 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9a9df04_a178_4ddb_a9e3_16ff994ef030.slice/crio-838aabfdb108912f6512397a5aad388f4e57696c7584b6a101b5ab55f70301e2 WatchSource:0}: Error finding container 838aabfdb108912f6512397a5aad388f4e57696c7584b6a101b5ab55f70301e2: Status 404 returned error can't find the container with id 838aabfdb108912f6512397a5aad388f4e57696c7584b6a101b5ab55f70301e2 Oct 12 08:23:00 crc kubenswrapper[4599]: I1012 08:23:00.573376 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s256c/crc-debug-t4ckk" event={"ID":"a9a9df04-a178-4ddb-a9e3-16ff994ef030","Type":"ContainerStarted","Data":"838aabfdb108912f6512397a5aad388f4e57696c7584b6a101b5ab55f70301e2"} Oct 12 08:23:00 crc kubenswrapper[4599]: I1012 08:23:00.574971 4599 scope.go:117] "RemoveContainer" containerID="349a117da203ab33f3f3130f17d622fc6a6214c18bf646deda76b9ec4d8bb039" Oct 12 08:23:00 crc kubenswrapper[4599]: I1012 08:23:00.575125 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s256c/crc-debug-hktzw" Oct 12 08:23:01 crc kubenswrapper[4599]: I1012 08:23:01.555870 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d59c4aaa-4020-482e-8ade-0211754fd86e" path="/var/lib/kubelet/pods/d59c4aaa-4020-482e-8ade-0211754fd86e/volumes" Oct 12 08:23:01 crc kubenswrapper[4599]: I1012 08:23:01.583918 4599 generic.go:334] "Generic (PLEG): container finished" podID="a9a9df04-a178-4ddb-a9e3-16ff994ef030" containerID="617ef708d1c6637a92e51b65089dc6205c2731aa430e2dec7f4a1decdbec48b3" exitCode=0 Oct 12 08:23:01 crc kubenswrapper[4599]: I1012 08:23:01.583954 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s256c/crc-debug-t4ckk" event={"ID":"a9a9df04-a178-4ddb-a9e3-16ff994ef030","Type":"ContainerDied","Data":"617ef708d1c6637a92e51b65089dc6205c2731aa430e2dec7f4a1decdbec48b3"} Oct 12 08:23:01 crc kubenswrapper[4599]: I1012 08:23:01.611562 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-s256c/crc-debug-t4ckk"] Oct 12 08:23:01 crc kubenswrapper[4599]: I1012 08:23:01.621766 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-s256c/crc-debug-t4ckk"] Oct 12 08:23:01 crc kubenswrapper[4599]: I1012 08:23:01.874162 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-689dd94bf4-pwcdz_c247e243-5ad3-4e53-a733-a11d9407c42a/barbican-api/0.log" Oct 12 08:23:01 crc kubenswrapper[4599]: I1012 08:23:01.895031 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-689dd94bf4-pwcdz_c247e243-5ad3-4e53-a733-a11d9407c42a/barbican-api-log/0.log" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.022404 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58d46f7854-2972q_82626116-d40d-45c2-8a6f-513cb12f6b19/barbican-keystone-listener/0.log" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 
08:23:02.041708 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58d46f7854-2972q_82626116-d40d-45c2-8a6f-513cb12f6b19/barbican-keystone-listener-log/0.log" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.117688 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d4cb4975-47tpw_f8d2d027-f32a-4708-b7cb-5302f1def41f/barbican-worker/0.log" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.184031 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d4cb4975-47tpw_f8d2d027-f32a-4708-b7cb-5302f1def41f/barbican-worker-log/0.log" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.279653 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd_f2098dde-6e8b-4a07-80d7-fc8e6d2c665e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.324149 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_95d7fa90-5a03-4991-810a-59cf46e55ebf/ceilometer-central-agent/0.log" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.360099 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_95d7fa90-5a03-4991-810a-59cf46e55ebf/ceilometer-notification-agent/0.log" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.434393 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_95d7fa90-5a03-4991-810a-59cf46e55ebf/proxy-httpd/0.log" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.458645 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_95d7fa90-5a03-4991-810a-59cf46e55ebf/sg-core/0.log" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.536620 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_5d006848-2829-4dab-b441-dddfc1737bfa/cinder-api/0.log" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.583615 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5d006848-2829-4dab-b441-dddfc1737bfa/cinder-api-log/0.log" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.644983 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_dfa968eb-bc29-434a-a2f6-6aebf5c8beda/cinder-scheduler/0.log" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.675058 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s256c/crc-debug-t4ckk" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.761129 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_dfa968eb-bc29-434a-a2f6-6aebf5c8beda/probe/0.log" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.801832 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx_9988d105-d7f0-459a-a8d9-056ac0d3abab/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.868396 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42bp7\" (UniqueName: \"kubernetes.io/projected/a9a9df04-a178-4ddb-a9e3-16ff994ef030-kube-api-access-42bp7\") pod \"a9a9df04-a178-4ddb-a9e3-16ff994ef030\" (UID: \"a9a9df04-a178-4ddb-a9e3-16ff994ef030\") " Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.868582 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9a9df04-a178-4ddb-a9e3-16ff994ef030-host\") pod \"a9a9df04-a178-4ddb-a9e3-16ff994ef030\" (UID: \"a9a9df04-a178-4ddb-a9e3-16ff994ef030\") " Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.868868 4599 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9a9df04-a178-4ddb-a9e3-16ff994ef030-host" (OuterVolumeSpecName: "host") pod "a9a9df04-a178-4ddb-a9e3-16ff994ef030" (UID: "a9a9df04-a178-4ddb-a9e3-16ff994ef030"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.874328 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9a9df04-a178-4ddb-a9e3-16ff994ef030-kube-api-access-42bp7" (OuterVolumeSpecName: "kube-api-access-42bp7") pod "a9a9df04-a178-4ddb-a9e3-16ff994ef030" (UID: "a9a9df04-a178-4ddb-a9e3-16ff994ef030"). InnerVolumeSpecName "kube-api-access-42bp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.925062 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-n2smx_dd9a9999-dc26-4df4-b259-dbdbc31766f3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.970886 4599 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9a9df04-a178-4ddb-a9e3-16ff994ef030-host\") on node \"crc\" DevicePath \"\"" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.970923 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42bp7\" (UniqueName: \"kubernetes.io/projected/a9a9df04-a178-4ddb-a9e3-16ff994ef030-kube-api-access-42bp7\") on node \"crc\" DevicePath \"\"" Oct 12 08:23:02 crc kubenswrapper[4599]: I1012 08:23:02.988833 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-sck65_9fd7ec79-0a36-4ac6-a81a-486df9b2ba89/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:23:03 crc kubenswrapper[4599]: I1012 08:23:03.091133 4599 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d885c8d8c-mm9q2_aa8d5577-ea50-40e9-8549-7c7ad4da7ee6/init/0.log" Oct 12 08:23:03 crc kubenswrapper[4599]: I1012 08:23:03.216084 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d885c8d8c-mm9q2_aa8d5577-ea50-40e9-8549-7c7ad4da7ee6/init/0.log" Oct 12 08:23:03 crc kubenswrapper[4599]: I1012 08:23:03.233664 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-bphhh_f37be313-7217-4822-82a3-b1c6edd70a45/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:23:03 crc kubenswrapper[4599]: I1012 08:23:03.261898 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d885c8d8c-mm9q2_aa8d5577-ea50-40e9-8549-7c7ad4da7ee6/dnsmasq-dns/0.log" Oct 12 08:23:03 crc kubenswrapper[4599]: I1012 08:23:03.392132 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a26e3618-b309-4ac5-b0e1-39feba422ef6/glance-httpd/0.log" Oct 12 08:23:03 crc kubenswrapper[4599]: I1012 08:23:03.400017 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a26e3618-b309-4ac5-b0e1-39feba422ef6/glance-log/0.log" Oct 12 08:23:03 crc kubenswrapper[4599]: I1012 08:23:03.554017 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9a9df04-a178-4ddb-a9e3-16ff994ef030" path="/var/lib/kubelet/pods/a9a9df04-a178-4ddb-a9e3-16ff994ef030/volumes" Oct 12 08:23:03 crc kubenswrapper[4599]: I1012 08:23:03.563483 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7fec8099-a2ba-4cbd-af30-75a787e3ead1/glance-log/0.log" Oct 12 08:23:03 crc kubenswrapper[4599]: I1012 08:23:03.566735 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7fec8099-a2ba-4cbd-af30-75a787e3ead1/glance-httpd/0.log" Oct 
12 08:23:03 crc kubenswrapper[4599]: I1012 08:23:03.603556 4599 scope.go:117] "RemoveContainer" containerID="617ef708d1c6637a92e51b65089dc6205c2731aa430e2dec7f4a1decdbec48b3" Oct 12 08:23:03 crc kubenswrapper[4599]: I1012 08:23:03.603608 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s256c/crc-debug-t4ckk" Oct 12 08:23:03 crc kubenswrapper[4599]: I1012 08:23:03.618828 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4_bc8b881e-7904-44fe-ae99-975ece57dc4c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:23:03 crc kubenswrapper[4599]: I1012 08:23:03.754146 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-6ngmq_e862c4d5-24d0-42ee-82f5-7a17fc6773aa/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:23:03 crc kubenswrapper[4599]: I1012 08:23:03.921256 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5d9d5b7486-4r48s_5d11880e-4007-4266-bf3b-8c1e3eea20b8/keystone-api/0.log" Oct 12 08:23:03 crc kubenswrapper[4599]: I1012 08:23:03.965345 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29337601-bb6kg_53717d04-56c1-42cc-af4a-f7edc51e3611/keystone-cron/0.log" Oct 12 08:23:04 crc kubenswrapper[4599]: I1012 08:23:04.072761 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_03a40b71-5a8f-42cd-97dd-e1b360a15b68/kube-state-metrics/0.log" Oct 12 08:23:04 crc kubenswrapper[4599]: I1012 08:23:04.152145 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh_ed0b094a-51d6-4287-b4a4-4a0934139fa2/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:23:04 crc kubenswrapper[4599]: I1012 08:23:04.407733 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-6dcf447b8f-d5qql_1f21bc7c-6371-45de-9766-ce9ad07df644/neutron-api/0.log" Oct 12 08:23:04 crc kubenswrapper[4599]: I1012 08:23:04.411110 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dcf447b8f-d5qql_1f21bc7c-6371-45de-9766-ce9ad07df644/neutron-httpd/0.log" Oct 12 08:23:04 crc kubenswrapper[4599]: I1012 08:23:04.631165 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25_d5a50516-f480-4da5-adb8-853dd9ce7b6c/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:23:04 crc kubenswrapper[4599]: I1012 08:23:04.983480 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_65c5adc9-0a5f-4631-bee5-c87a70c0d0a2/nova-cell0-conductor-conductor/0.log" Oct 12 08:23:05 crc kubenswrapper[4599]: I1012 08:23:05.034704 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_fd526957-c4dc-40c6-87e3-eb3784e09fb5/nova-api-log/0.log" Oct 12 08:23:05 crc kubenswrapper[4599]: I1012 08:23:05.132458 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_fd526957-c4dc-40c6-87e3-eb3784e09fb5/nova-api-api/0.log" Oct 12 08:23:05 crc kubenswrapper[4599]: I1012 08:23:05.276042 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_c21fe8be-d815-4e07-9ea8-e22d73e2dd8f/nova-cell1-conductor-conductor/0.log" Oct 12 08:23:05 crc kubenswrapper[4599]: I1012 08:23:05.316586 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_2abe9d8b-a086-4e4c-8873-3b50714935c9/nova-cell1-novncproxy-novncproxy/0.log" Oct 12 08:23:05 crc kubenswrapper[4599]: I1012 08:23:05.464040 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-qxq8q_c01400c9-ebac-486d-ac74-9cec09171386/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:23:05 crc kubenswrapper[4599]: I1012 08:23:05.545152 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:23:05 crc kubenswrapper[4599]: E1012 08:23:05.545728 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:23:05 crc kubenswrapper[4599]: I1012 08:23:05.647520 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_092769ed-436e-4aee-b3e9-4b2eeb4c487e/nova-metadata-log/0.log" Oct 12 08:23:05 crc kubenswrapper[4599]: I1012 08:23:05.770795 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_223d3e1a-86ac-49d9-a231-b77957770434/nova-scheduler-scheduler/0.log" Oct 12 08:23:05 crc kubenswrapper[4599]: I1012 08:23:05.826982 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e76c3f9c-bea3-4b35-852c-65d48f177d8a/mysql-bootstrap/0.log" Oct 12 08:23:05 crc kubenswrapper[4599]: I1012 08:23:05.981647 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e76c3f9c-bea3-4b35-852c-65d48f177d8a/mysql-bootstrap/0.log" Oct 12 08:23:06 crc kubenswrapper[4599]: I1012 08:23:06.098880 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e76c3f9c-bea3-4b35-852c-65d48f177d8a/galera/0.log" Oct 12 08:23:06 crc kubenswrapper[4599]: I1012 08:23:06.217646 4599 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4a035bca-ccfe-4dc6-949a-44d2ddf0fa26/mysql-bootstrap/0.log" Oct 12 08:23:06 crc kubenswrapper[4599]: I1012 08:23:06.359152 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_092769ed-436e-4aee-b3e9-4b2eeb4c487e/nova-metadata-metadata/0.log" Oct 12 08:23:06 crc kubenswrapper[4599]: I1012 08:23:06.365368 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4a035bca-ccfe-4dc6-949a-44d2ddf0fa26/mysql-bootstrap/0.log" Oct 12 08:23:06 crc kubenswrapper[4599]: I1012 08:23:06.376985 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4a035bca-ccfe-4dc6-949a-44d2ddf0fa26/galera/0.log" Oct 12 08:23:06 crc kubenswrapper[4599]: I1012 08:23:06.515458 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_b6db97ed-496b-4f4d-bb27-2bce6e003912/openstackclient/0.log" Oct 12 08:23:06 crc kubenswrapper[4599]: I1012 08:23:06.571462 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9rbk6_ff6de4a7-bd76-46bc-a376-b1ec8c5ab712/ovn-controller/0.log" Oct 12 08:23:06 crc kubenswrapper[4599]: I1012 08:23:06.732621 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-4dn56_ce16e02d-17cf-467a-aca5-944a67d4cd79/openstack-network-exporter/0.log" Oct 12 08:23:06 crc kubenswrapper[4599]: I1012 08:23:06.772685 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8s62q_78cb767a-31ee-4e29-b075-e773a43272c2/ovsdb-server-init/0.log" Oct 12 08:23:07 crc kubenswrapper[4599]: I1012 08:23:07.016257 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8s62q_78cb767a-31ee-4e29-b075-e773a43272c2/ovs-vswitchd/0.log" Oct 12 08:23:07 crc kubenswrapper[4599]: I1012 08:23:07.019308 4599 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovn-controller-ovs-8s62q_78cb767a-31ee-4e29-b075-e773a43272c2/ovsdb-server/0.log" Oct 12 08:23:07 crc kubenswrapper[4599]: I1012 08:23:07.019523 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8s62q_78cb767a-31ee-4e29-b075-e773a43272c2/ovsdb-server-init/0.log" Oct 12 08:23:07 crc kubenswrapper[4599]: I1012 08:23:07.202905 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-nmnvn_29a6cb6f-6a2a-405e-a24b-5d49ed9288cd/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:23:07 crc kubenswrapper[4599]: I1012 08:23:07.278311 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_07a217da-3192-46dd-a935-7b124b5e6961/openstack-network-exporter/0.log" Oct 12 08:23:07 crc kubenswrapper[4599]: I1012 08:23:07.302879 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_07a217da-3192-46dd-a935-7b124b5e6961/ovn-northd/0.log" Oct 12 08:23:07 crc kubenswrapper[4599]: I1012 08:23:07.471081 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9d3345d9-bc30-42dc-98e0-bfd24fee35ab/ovsdbserver-nb/0.log" Oct 12 08:23:07 crc kubenswrapper[4599]: I1012 08:23:07.621995 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9d3345d9-bc30-42dc-98e0-bfd24fee35ab/openstack-network-exporter/0.log" Oct 12 08:23:07 crc kubenswrapper[4599]: I1012 08:23:07.760952 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b7b21b67-7112-4507-a5d2-9036f09a3cdf/openstack-network-exporter/0.log" Oct 12 08:23:07 crc kubenswrapper[4599]: I1012 08:23:07.819740 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b7b21b67-7112-4507-a5d2-9036f09a3cdf/ovsdbserver-sb/0.log" Oct 12 08:23:07 crc kubenswrapper[4599]: I1012 08:23:07.922747 4599 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_placement-55dcb544bd-wvf5d_6d346d4c-1358-4305-89ac-c9c012143de6/placement-api/0.log" Oct 12 08:23:08 crc kubenswrapper[4599]: I1012 08:23:08.008809 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a9c90266-89e1-4527-8fa2-91826cbcc778/setup-container/0.log" Oct 12 08:23:08 crc kubenswrapper[4599]: I1012 08:23:08.027907 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-55dcb544bd-wvf5d_6d346d4c-1358-4305-89ac-c9c012143de6/placement-log/0.log" Oct 12 08:23:08 crc kubenswrapper[4599]: I1012 08:23:08.232500 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a9c90266-89e1-4527-8fa2-91826cbcc778/setup-container/0.log" Oct 12 08:23:08 crc kubenswrapper[4599]: I1012 08:23:08.243778 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a9c90266-89e1-4527-8fa2-91826cbcc778/rabbitmq/0.log" Oct 12 08:23:08 crc kubenswrapper[4599]: I1012 08:23:08.296252 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ab388f40-a761-44f6-812f-df5cf4b02b73/setup-container/0.log" Oct 12 08:23:08 crc kubenswrapper[4599]: I1012 08:23:08.488036 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95_e741b043-6773-4cbb-88fc-d3dc8cd7d39d/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:23:08 crc kubenswrapper[4599]: I1012 08:23:08.505125 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ab388f40-a761-44f6-812f-df5cf4b02b73/setup-container/0.log" Oct 12 08:23:08 crc kubenswrapper[4599]: I1012 08:23:08.532923 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ab388f40-a761-44f6-812f-df5cf4b02b73/rabbitmq/0.log" Oct 12 08:23:08 crc kubenswrapper[4599]: I1012 
08:23:08.658256 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-sx5tv_39f84a87-f390-4864-85f1-d4df13fe6b93/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:23:08 crc kubenswrapper[4599]: I1012 08:23:08.732810 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl_dc14ab80-62e8-47ec-bf5b-370ccfd95eff/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:23:08 crc kubenswrapper[4599]: I1012 08:23:08.887354 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-ssdqc_53901e82-60d6-4dd0-9ec9-15851ecb4215/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:23:08 crc kubenswrapper[4599]: I1012 08:23:08.962452 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-kbv4c_88ad049a-ede5-4ac3-842b-c1ab9199014a/ssh-known-hosts-edpm-deployment/0.log" Oct 12 08:23:09 crc kubenswrapper[4599]: I1012 08:23:09.124449 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-786bc7649f-6qt66_9a10375e-8317-473b-87b4-07c82831ac41/proxy-server/0.log" Oct 12 08:23:09 crc kubenswrapper[4599]: I1012 08:23:09.154577 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-786bc7649f-6qt66_9a10375e-8317-473b-87b4-07c82831ac41/proxy-httpd/0.log" Oct 12 08:23:09 crc kubenswrapper[4599]: I1012 08:23:09.289810 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-w8g6f_3f44cfe9-f015-4084-b100-fbb08f528667/swift-ring-rebalance/0.log" Oct 12 08:23:09 crc kubenswrapper[4599]: I1012 08:23:09.381152 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/account-auditor/0.log" Oct 12 08:23:09 crc kubenswrapper[4599]: I1012 08:23:09.388754 4599 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/account-reaper/0.log" Oct 12 08:23:09 crc kubenswrapper[4599]: I1012 08:23:09.487426 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/account-server/0.log" Oct 12 08:23:09 crc kubenswrapper[4599]: I1012 08:23:09.558009 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/account-replicator/0.log" Oct 12 08:23:09 crc kubenswrapper[4599]: I1012 08:23:09.586510 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/container-auditor/0.log" Oct 12 08:23:09 crc kubenswrapper[4599]: I1012 08:23:09.620762 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/container-replicator/0.log" Oct 12 08:23:09 crc kubenswrapper[4599]: I1012 08:23:09.702992 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/container-server/0.log" Oct 12 08:23:09 crc kubenswrapper[4599]: I1012 08:23:09.750877 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/container-updater/0.log" Oct 12 08:23:09 crc kubenswrapper[4599]: I1012 08:23:09.784481 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/object-auditor/0.log" Oct 12 08:23:09 crc kubenswrapper[4599]: I1012 08:23:09.808554 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/object-expirer/0.log" Oct 12 08:23:09 crc kubenswrapper[4599]: I1012 08:23:09.904175 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/object-replicator/0.log" Oct 12 08:23:09 crc kubenswrapper[4599]: I1012 08:23:09.909735 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/object-server/0.log" Oct 12 08:23:09 crc kubenswrapper[4599]: I1012 08:23:09.985556 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/object-updater/0.log" Oct 12 08:23:10 crc kubenswrapper[4599]: I1012 08:23:10.004583 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/rsync/0.log" Oct 12 08:23:10 crc kubenswrapper[4599]: I1012 08:23:10.096999 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/swift-recon-cron/0.log" Oct 12 08:23:10 crc kubenswrapper[4599]: I1012 08:23:10.207346 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-b67cz_a489145f-f0fe-4e55-a9eb-29df1419aa2b/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:23:10 crc kubenswrapper[4599]: I1012 08:23:10.292872 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_bdf67e61-9c15-4079-9a9b-a74c40ad364f/tempest-tests-tempest-tests-runner/0.log" Oct 12 08:23:10 crc kubenswrapper[4599]: I1012 08:23:10.395716 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_11baa2a2-2767-4dde-96b3-708570c6575d/test-operator-logs-container/0.log" Oct 12 08:23:10 crc kubenswrapper[4599]: I1012 08:23:10.522884 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-d787p_0027b20a-21c6-437b-b807-50484ab21289/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:23:18 crc kubenswrapper[4599]: I1012 08:23:18.302597 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_fe3787ed-cb03-457b-ad65-b33044cccffd/memcached/0.log" Oct 12 08:23:18 crc kubenswrapper[4599]: I1012 08:23:18.545068 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:23:18 crc kubenswrapper[4599]: E1012 08:23:18.545312 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:23:29 crc kubenswrapper[4599]: I1012 08:23:29.484890 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-65btt_390a00af-1983-41ce-b7f2-3190e2d1594e/kube-rbac-proxy/0.log" Oct 12 08:23:29 crc kubenswrapper[4599]: I1012 08:23:29.544907 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-65btt_390a00af-1983-41ce-b7f2-3190e2d1594e/manager/0.log" Oct 12 08:23:29 crc kubenswrapper[4599]: I1012 08:23:29.654129 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj_a64bef86-ebe2-411a-92d3-f63087030b92/util/0.log" Oct 12 08:23:29 crc kubenswrapper[4599]: I1012 08:23:29.820322 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj_a64bef86-ebe2-411a-92d3-f63087030b92/util/0.log" Oct 12 08:23:29 crc kubenswrapper[4599]: I1012 08:23:29.828052 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj_a64bef86-ebe2-411a-92d3-f63087030b92/pull/0.log" Oct 12 08:23:29 crc kubenswrapper[4599]: I1012 08:23:29.832177 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj_a64bef86-ebe2-411a-92d3-f63087030b92/pull/0.log" Oct 12 08:23:29 crc kubenswrapper[4599]: I1012 08:23:29.981014 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj_a64bef86-ebe2-411a-92d3-f63087030b92/util/0.log" Oct 12 08:23:29 crc kubenswrapper[4599]: I1012 08:23:29.984922 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj_a64bef86-ebe2-411a-92d3-f63087030b92/extract/0.log" Oct 12 08:23:29 crc kubenswrapper[4599]: I1012 08:23:29.992143 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj_a64bef86-ebe2-411a-92d3-f63087030b92/pull/0.log" Oct 12 08:23:30 crc kubenswrapper[4599]: I1012 08:23:30.143545 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-nwqxv_f13311d0-566a-4c8d-823c-fae47384cd53/kube-rbac-proxy/0.log" Oct 12 08:23:30 crc kubenswrapper[4599]: I1012 08:23:30.153849 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-nwqxv_f13311d0-566a-4c8d-823c-fae47384cd53/manager/0.log" Oct 12 08:23:30 crc 
kubenswrapper[4599]: I1012 08:23:30.192635 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-rd8wv_37355d3f-b321-446a-b0ac-5d3a770bd0c5/kube-rbac-proxy/0.log" Oct 12 08:23:30 crc kubenswrapper[4599]: I1012 08:23:30.301341 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-kbqh6_4f32c957-e414-456c-b06e-6f38553efe85/kube-rbac-proxy/0.log" Oct 12 08:23:30 crc kubenswrapper[4599]: I1012 08:23:30.325600 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-rd8wv_37355d3f-b321-446a-b0ac-5d3a770bd0c5/manager/0.log" Oct 12 08:23:30 crc kubenswrapper[4599]: I1012 08:23:30.448540 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-kbqh6_4f32c957-e414-456c-b06e-6f38553efe85/manager/0.log" Oct 12 08:23:30 crc kubenswrapper[4599]: I1012 08:23:30.492412 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-n6scs_986744d6-7f19-4f08-9dfd-03629fe2ca58/kube-rbac-proxy/0.log" Oct 12 08:23:30 crc kubenswrapper[4599]: I1012 08:23:30.520273 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-n6scs_986744d6-7f19-4f08-9dfd-03629fe2ca58/manager/0.log" Oct 12 08:23:30 crc kubenswrapper[4599]: I1012 08:23:30.617892 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-hlvdz_8ce364df-e28b-45cf-ae95-92ae415392f0/kube-rbac-proxy/0.log" Oct 12 08:23:30 crc kubenswrapper[4599]: I1012 08:23:30.661511 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-hlvdz_8ce364df-e28b-45cf-ae95-92ae415392f0/manager/0.log" Oct 12 08:23:30 crc kubenswrapper[4599]: I1012 08:23:30.747829 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-6xt98_32a9cc54-3488-4659-a83f-0a6dc0c402c9/kube-rbac-proxy/0.log" Oct 12 08:23:30 crc kubenswrapper[4599]: I1012 08:23:30.826303 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-gq5k6_f70b2a0c-df5a-4a41-89db-e1bf314ee45a/kube-rbac-proxy/0.log" Oct 12 08:23:30 crc kubenswrapper[4599]: I1012 08:23:30.914365 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-6xt98_32a9cc54-3488-4659-a83f-0a6dc0c402c9/manager/0.log" Oct 12 08:23:30 crc kubenswrapper[4599]: I1012 08:23:30.917657 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-gq5k6_f70b2a0c-df5a-4a41-89db-e1bf314ee45a/manager/0.log" Oct 12 08:23:31 crc kubenswrapper[4599]: I1012 08:23:31.070489 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-7zfpt_297195f2-cb07-4bd8-9994-82f16e2f83f3/kube-rbac-proxy/0.log" Oct 12 08:23:31 crc kubenswrapper[4599]: I1012 08:23:31.112584 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-7zfpt_297195f2-cb07-4bd8-9994-82f16e2f83f3/manager/0.log" Oct 12 08:23:31 crc kubenswrapper[4599]: I1012 08:23:31.174472 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-sbgr8_6d340683-0013-4bf4-b98b-32610996ded4/kube-rbac-proxy/0.log" Oct 12 08:23:31 crc kubenswrapper[4599]: I1012 08:23:31.200300 
4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-sbgr8_6d340683-0013-4bf4-b98b-32610996ded4/manager/0.log" Oct 12 08:23:31 crc kubenswrapper[4599]: I1012 08:23:31.270574 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-5zhzq_07c5394f-62de-4eca-86c0-c534788aead5/kube-rbac-proxy/0.log" Oct 12 08:23:31 crc kubenswrapper[4599]: I1012 08:23:31.370220 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-5zhzq_07c5394f-62de-4eca-86c0-c534788aead5/manager/0.log" Oct 12 08:23:31 crc kubenswrapper[4599]: I1012 08:23:31.404786 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-cz2h5_e88fb634-df40-47e9-a349-e7ac89e134f2/kube-rbac-proxy/0.log" Oct 12 08:23:31 crc kubenswrapper[4599]: I1012 08:23:31.455473 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-cz2h5_e88fb634-df40-47e9-a349-e7ac89e134f2/manager/0.log" Oct 12 08:23:31 crc kubenswrapper[4599]: I1012 08:23:31.535455 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-md9kr_84280387-ac26-4496-8c00-72673a91cb12/kube-rbac-proxy/0.log" Oct 12 08:23:31 crc kubenswrapper[4599]: I1012 08:23:31.544840 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:23:31 crc kubenswrapper[4599]: E1012 08:23:31.545069 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:23:31 crc kubenswrapper[4599]: I1012 08:23:31.635791 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-md9kr_84280387-ac26-4496-8c00-72673a91cb12/manager/0.log" Oct 12 08:23:31 crc kubenswrapper[4599]: I1012 08:23:31.721281 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-8ccz2_bdeb0fae-d9af-4253-a90c-8a50255cc6fe/kube-rbac-proxy/0.log" Oct 12 08:23:31 crc kubenswrapper[4599]: I1012 08:23:31.722176 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-8ccz2_bdeb0fae-d9af-4253-a90c-8a50255cc6fe/manager/0.log" Oct 12 08:23:31 crc kubenswrapper[4599]: I1012 08:23:31.843656 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj_0cdab794-6175-4fb9-bd9d-c1080d45ee30/kube-rbac-proxy/0.log" Oct 12 08:23:31 crc kubenswrapper[4599]: I1012 08:23:31.873966 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj_0cdab794-6175-4fb9-bd9d-c1080d45ee30/manager/0.log" Oct 12 08:23:31 crc kubenswrapper[4599]: I1012 08:23:31.991276 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b95c8954b-z5lsg_f4099eb0-0999-42a2-b525-5ae6b0ad984b/kube-rbac-proxy/0.log" Oct 12 08:23:32 crc kubenswrapper[4599]: I1012 08:23:32.072506 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-688d597459-gfhbq_bc697113-b995-44e9-92d2-070e55b12965/kube-rbac-proxy/0.log" Oct 12 08:23:32 crc kubenswrapper[4599]: I1012 08:23:32.305083 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-688d597459-gfhbq_bc697113-b995-44e9-92d2-070e55b12965/operator/0.log" Oct 12 08:23:32 crc kubenswrapper[4599]: I1012 08:23:32.315070 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-ds62j_2167f650-c160-4ede-ae67-5c8fd1f86b25/registry-server/0.log" Oct 12 08:23:32 crc kubenswrapper[4599]: I1012 08:23:32.491921 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79df5fb58c-2fhbx_dc9b0db0-82c1-492b-94de-c8f93e96364f/kube-rbac-proxy/0.log" Oct 12 08:23:32 crc kubenswrapper[4599]: I1012 08:23:32.504292 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79df5fb58c-2fhbx_dc9b0db0-82c1-492b-94de-c8f93e96364f/manager/0.log" Oct 12 08:23:32 crc kubenswrapper[4599]: I1012 08:23:32.547501 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-d2q7d_b1de56e0-d6e0-4c5a-9c4e-c725f171e142/kube-rbac-proxy/0.log" Oct 12 08:23:32 crc kubenswrapper[4599]: I1012 08:23:32.672517 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-d2q7d_b1de56e0-d6e0-4c5a-9c4e-c725f171e142/manager/0.log" Oct 12 08:23:32 crc kubenswrapper[4599]: I1012 08:23:32.714383 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-7sh6m_d5252e14-f285-43af-ace5-375bcfbe4c68/operator/0.log" Oct 12 08:23:32 crc kubenswrapper[4599]: I1012 08:23:32.882385 4599 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-gzlp6_caf862ab-9fa9-4c44-8e6c-35599bcc45a1/kube-rbac-proxy/0.log" Oct 12 08:23:32 crc kubenswrapper[4599]: I1012 08:23:32.888774 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b95c8954b-z5lsg_f4099eb0-0999-42a2-b525-5ae6b0ad984b/manager/0.log" Oct 12 08:23:32 crc kubenswrapper[4599]: I1012 08:23:32.912665 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-gzlp6_caf862ab-9fa9-4c44-8e6c-35599bcc45a1/manager/0.log" Oct 12 08:23:32 crc kubenswrapper[4599]: I1012 08:23:32.956472 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-67cfc6749b-m9l8f_5c565798-c0f8-4d14-b531-386b1b0efc63/kube-rbac-proxy/0.log" Oct 12 08:23:33 crc kubenswrapper[4599]: I1012 08:23:33.036841 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5458f77c4-b6bwb_5f9bd209-b904-4998-843c-d4573b0a2cd0/kube-rbac-proxy/0.log" Oct 12 08:23:33 crc kubenswrapper[4599]: I1012 08:23:33.059649 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5458f77c4-b6bwb_5f9bd209-b904-4998-843c-d4573b0a2cd0/manager/0.log" Oct 12 08:23:33 crc kubenswrapper[4599]: I1012 08:23:33.074201 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-67cfc6749b-m9l8f_5c565798-c0f8-4d14-b531-386b1b0efc63/manager/0.log" Oct 12 08:23:33 crc kubenswrapper[4599]: I1012 08:23:33.177396 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f554bff7b-ffpvz_20be3b1c-5de2-4c22-a3af-215e2272d586/kube-rbac-proxy/0.log" Oct 12 08:23:33 crc 
kubenswrapper[4599]: I1012 08:23:33.199370 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f554bff7b-ffpvz_20be3b1c-5de2-4c22-a3af-215e2272d586/manager/0.log" Oct 12 08:23:44 crc kubenswrapper[4599]: I1012 08:23:44.545419 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:23:44 crc kubenswrapper[4599]: E1012 08:23:44.546011 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:23:44 crc kubenswrapper[4599]: I1012 08:23:44.745714 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-44z27_b22d491a-9add-4ec5-ad3e-e593f9ca93bd/control-plane-machine-set-operator/0.log" Oct 12 08:23:44 crc kubenswrapper[4599]: I1012 08:23:44.944359 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tgck5_1e511cbc-ca72-4380-8626-d9cade8ce3e2/kube-rbac-proxy/0.log" Oct 12 08:23:44 crc kubenswrapper[4599]: I1012 08:23:44.986377 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tgck5_1e511cbc-ca72-4380-8626-d9cade8ce3e2/machine-api-operator/0.log" Oct 12 08:23:54 crc kubenswrapper[4599]: I1012 08:23:54.530490 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-7v5dn_70c0dff5-0cd7-4399-a044-c95469bea793/cert-manager-controller/0.log" Oct 12 08:23:54 crc kubenswrapper[4599]: I1012 08:23:54.651478 4599 log.go:25] 
"Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-kpxnw_88150699-5d1d-4b47-ad1e-bbe4cf006a3e/cert-manager-cainjector/0.log" Oct 12 08:23:54 crc kubenswrapper[4599]: I1012 08:23:54.674040 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-tp9pm_c6d2a135-7c1d-4cfb-b8ee-fa9737f62776/cert-manager-webhook/0.log" Oct 12 08:23:57 crc kubenswrapper[4599]: I1012 08:23:57.545229 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:23:57 crc kubenswrapper[4599]: E1012 08:23:57.545730 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:24:03 crc kubenswrapper[4599]: I1012 08:24:03.236037 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-klx8j_2ff98253-bb25-450e-9202-817788dab660/nmstate-console-plugin/0.log" Oct 12 08:24:03 crc kubenswrapper[4599]: I1012 08:24:03.370872 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-mxzhr_cf76296b-ab43-4e40-83c9-ee507169ea4c/nmstate-handler/0.log" Oct 12 08:24:03 crc kubenswrapper[4599]: I1012 08:24:03.432197 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-tls82_126c69f6-12a5-46a8-a817-23b97dc624d7/kube-rbac-proxy/0.log" Oct 12 08:24:03 crc kubenswrapper[4599]: I1012 08:24:03.449941 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-tls82_126c69f6-12a5-46a8-a817-23b97dc624d7/nmstate-metrics/0.log" Oct 12 08:24:03 crc kubenswrapper[4599]: I1012 08:24:03.583365 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-vz57h_87a55b6b-2189-4332-beb0-5bf12c1ded00/nmstate-operator/0.log" Oct 12 08:24:03 crc kubenswrapper[4599]: I1012 08:24:03.608038 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-rb46k_fe7e44b3-5972-4d03-8919-8a67214fee06/nmstate-webhook/0.log" Oct 12 08:24:11 crc kubenswrapper[4599]: I1012 08:24:11.545616 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:24:11 crc kubenswrapper[4599]: E1012 08:24:11.546514 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:24:13 crc kubenswrapper[4599]: I1012 08:24:13.572465 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-v2d4g_a8d05e19-feeb-41e3-ab30-f55af42472ca/kube-rbac-proxy/0.log" Oct 12 08:24:13 crc kubenswrapper[4599]: I1012 08:24:13.656104 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-v2d4g_a8d05e19-feeb-41e3-ab30-f55af42472ca/controller/0.log" Oct 12 08:24:13 crc kubenswrapper[4599]: I1012 08:24:13.727121 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-frr-files/0.log" Oct 12 08:24:13 crc kubenswrapper[4599]: I1012 
08:24:13.902654 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-metrics/0.log" Oct 12 08:24:13 crc kubenswrapper[4599]: I1012 08:24:13.912977 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-frr-files/0.log" Oct 12 08:24:13 crc kubenswrapper[4599]: I1012 08:24:13.923773 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-reloader/0.log" Oct 12 08:24:13 crc kubenswrapper[4599]: I1012 08:24:13.950236 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-reloader/0.log" Oct 12 08:24:14 crc kubenswrapper[4599]: I1012 08:24:14.063375 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-frr-files/0.log" Oct 12 08:24:14 crc kubenswrapper[4599]: I1012 08:24:14.066767 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-reloader/0.log" Oct 12 08:24:14 crc kubenswrapper[4599]: I1012 08:24:14.079417 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-metrics/0.log" Oct 12 08:24:14 crc kubenswrapper[4599]: I1012 08:24:14.102941 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-metrics/0.log" Oct 12 08:24:14 crc kubenswrapper[4599]: I1012 08:24:14.228798 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-metrics/0.log" Oct 12 08:24:14 crc kubenswrapper[4599]: I1012 08:24:14.230618 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-reloader/0.log" Oct 12 08:24:14 crc kubenswrapper[4599]: I1012 08:24:14.254966 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-frr-files/0.log" Oct 12 08:24:14 crc kubenswrapper[4599]: I1012 08:24:14.257061 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/controller/0.log" Oct 12 08:24:14 crc kubenswrapper[4599]: I1012 08:24:14.394158 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/kube-rbac-proxy/0.log" Oct 12 08:24:14 crc kubenswrapper[4599]: I1012 08:24:14.402797 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/frr-metrics/0.log" Oct 12 08:24:14 crc kubenswrapper[4599]: I1012 08:24:14.416230 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/kube-rbac-proxy-frr/0.log" Oct 12 08:24:14 crc kubenswrapper[4599]: I1012 08:24:14.553387 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/reloader/0.log" Oct 12 08:24:14 crc kubenswrapper[4599]: I1012 08:24:14.616292 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-zcx46_6f5b7478-9c00-4302-b356-cae717338202/frr-k8s-webhook-server/0.log" Oct 12 08:24:14 crc kubenswrapper[4599]: I1012 08:24:14.816055 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-75d566c47b-2dhk8_14e65bd0-37ae-438c-9d25-b2d4b70556e7/manager/0.log" Oct 12 08:24:14 crc kubenswrapper[4599]: I1012 08:24:14.917165 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5c94cffdb4-lj8nf_bb4359a2-e2a1-4e37-b7df-420ab49781c6/webhook-server/0.log" Oct 12 08:24:15 crc kubenswrapper[4599]: I1012 08:24:15.095158 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-g4nt4_12d002b3-67b4-4405-ab2f-930346bfc610/kube-rbac-proxy/0.log" Oct 12 08:24:15 crc kubenswrapper[4599]: I1012 08:24:15.522890 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-g4nt4_12d002b3-67b4-4405-ab2f-930346bfc610/speaker/0.log" Oct 12 08:24:15 crc kubenswrapper[4599]: I1012 08:24:15.572735 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/frr/0.log" Oct 12 08:24:23 crc kubenswrapper[4599]: I1012 08:24:23.549328 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:24:23 crc kubenswrapper[4599]: E1012 08:24:23.549947 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:24:24 crc kubenswrapper[4599]: I1012 08:24:24.199987 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v_492ddcff-667c-4dda-b878-741413cb8aa1/util/0.log" Oct 12 08:24:24 crc kubenswrapper[4599]: I1012 08:24:24.340829 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v_492ddcff-667c-4dda-b878-741413cb8aa1/util/0.log" Oct 12 08:24:24 crc 
kubenswrapper[4599]: I1012 08:24:24.357638 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v_492ddcff-667c-4dda-b878-741413cb8aa1/pull/0.log" Oct 12 08:24:24 crc kubenswrapper[4599]: I1012 08:24:24.364065 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v_492ddcff-667c-4dda-b878-741413cb8aa1/pull/0.log" Oct 12 08:24:24 crc kubenswrapper[4599]: I1012 08:24:24.505740 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v_492ddcff-667c-4dda-b878-741413cb8aa1/util/0.log" Oct 12 08:24:24 crc kubenswrapper[4599]: I1012 08:24:24.511569 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v_492ddcff-667c-4dda-b878-741413cb8aa1/pull/0.log" Oct 12 08:24:24 crc kubenswrapper[4599]: I1012 08:24:24.552853 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v_492ddcff-667c-4dda-b878-741413cb8aa1/extract/0.log" Oct 12 08:24:24 crc kubenswrapper[4599]: I1012 08:24:24.652081 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sn9x6_c3548e29-bf51-474a-9110-60bfad743fd3/extract-utilities/0.log" Oct 12 08:24:24 crc kubenswrapper[4599]: I1012 08:24:24.834536 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sn9x6_c3548e29-bf51-474a-9110-60bfad743fd3/extract-content/0.log" Oct 12 08:24:24 crc kubenswrapper[4599]: I1012 08:24:24.837436 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-sn9x6_c3548e29-bf51-474a-9110-60bfad743fd3/extract-utilities/0.log" Oct 12 08:24:24 crc kubenswrapper[4599]: I1012 08:24:24.848525 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sn9x6_c3548e29-bf51-474a-9110-60bfad743fd3/extract-content/0.log" Oct 12 08:24:25 crc kubenswrapper[4599]: I1012 08:24:25.023258 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sn9x6_c3548e29-bf51-474a-9110-60bfad743fd3/extract-utilities/0.log" Oct 12 08:24:25 crc kubenswrapper[4599]: I1012 08:24:25.075864 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sn9x6_c3548e29-bf51-474a-9110-60bfad743fd3/extract-content/0.log" Oct 12 08:24:25 crc kubenswrapper[4599]: I1012 08:24:25.234137 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vz257_45933fcb-2999-4505-8356-fb50f8d1e2c7/extract-utilities/0.log" Oct 12 08:24:25 crc kubenswrapper[4599]: I1012 08:24:25.339470 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sn9x6_c3548e29-bf51-474a-9110-60bfad743fd3/registry-server/0.log" Oct 12 08:24:25 crc kubenswrapper[4599]: I1012 08:24:25.401114 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vz257_45933fcb-2999-4505-8356-fb50f8d1e2c7/extract-utilities/0.log" Oct 12 08:24:25 crc kubenswrapper[4599]: I1012 08:24:25.429837 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vz257_45933fcb-2999-4505-8356-fb50f8d1e2c7/extract-content/0.log" Oct 12 08:24:25 crc kubenswrapper[4599]: I1012 08:24:25.445608 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-vz257_45933fcb-2999-4505-8356-fb50f8d1e2c7/extract-content/0.log" Oct 12 08:24:25 crc kubenswrapper[4599]: I1012 08:24:25.574237 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vz257_45933fcb-2999-4505-8356-fb50f8d1e2c7/extract-utilities/0.log" Oct 12 08:24:25 crc kubenswrapper[4599]: I1012 08:24:25.597136 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vz257_45933fcb-2999-4505-8356-fb50f8d1e2c7/extract-content/0.log" Oct 12 08:24:25 crc kubenswrapper[4599]: I1012 08:24:25.766822 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr_82699d9d-fb0c-46be-9020-16bb7e4cb65c/util/0.log" Oct 12 08:24:25 crc kubenswrapper[4599]: I1012 08:24:25.928721 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr_82699d9d-fb0c-46be-9020-16bb7e4cb65c/pull/0.log" Oct 12 08:24:25 crc kubenswrapper[4599]: I1012 08:24:25.989276 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr_82699d9d-fb0c-46be-9020-16bb7e4cb65c/util/0.log" Oct 12 08:24:25 crc kubenswrapper[4599]: I1012 08:24:25.993870 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr_82699d9d-fb0c-46be-9020-16bb7e4cb65c/pull/0.log" Oct 12 08:24:26 crc kubenswrapper[4599]: I1012 08:24:26.049774 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vz257_45933fcb-2999-4505-8356-fb50f8d1e2c7/registry-server/0.log" Oct 12 08:24:26 crc kubenswrapper[4599]: I1012 08:24:26.156245 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr_82699d9d-fb0c-46be-9020-16bb7e4cb65c/util/0.log" Oct 12 08:24:26 crc kubenswrapper[4599]: I1012 08:24:26.183201 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr_82699d9d-fb0c-46be-9020-16bb7e4cb65c/pull/0.log" Oct 12 08:24:26 crc kubenswrapper[4599]: I1012 08:24:26.228347 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr_82699d9d-fb0c-46be-9020-16bb7e4cb65c/extract/0.log" Oct 12 08:24:26 crc kubenswrapper[4599]: I1012 08:24:26.331284 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2vntc_04a727ef-7194-4df1-b0a2-0107085a972d/marketplace-operator/0.log" Oct 12 08:24:26 crc kubenswrapper[4599]: I1012 08:24:26.369593 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xkm4x_4b6aaa9e-4f52-45ec-994b-c0b4e600c00f/extract-utilities/0.log" Oct 12 08:24:26 crc kubenswrapper[4599]: I1012 08:24:26.493851 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xkm4x_4b6aaa9e-4f52-45ec-994b-c0b4e600c00f/extract-utilities/0.log" Oct 12 08:24:26 crc kubenswrapper[4599]: I1012 08:24:26.517601 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xkm4x_4b6aaa9e-4f52-45ec-994b-c0b4e600c00f/extract-content/0.log" Oct 12 08:24:26 crc kubenswrapper[4599]: I1012 08:24:26.521046 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xkm4x_4b6aaa9e-4f52-45ec-994b-c0b4e600c00f/extract-content/0.log" Oct 12 08:24:26 crc kubenswrapper[4599]: I1012 08:24:26.649325 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-xkm4x_4b6aaa9e-4f52-45ec-994b-c0b4e600c00f/extract-content/0.log" Oct 12 08:24:26 crc kubenswrapper[4599]: I1012 08:24:26.656744 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xkm4x_4b6aaa9e-4f52-45ec-994b-c0b4e600c00f/extract-utilities/0.log" Oct 12 08:24:26 crc kubenswrapper[4599]: I1012 08:24:26.749289 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xkm4x_4b6aaa9e-4f52-45ec-994b-c0b4e600c00f/registry-server/0.log" Oct 12 08:24:26 crc kubenswrapper[4599]: I1012 08:24:26.831801 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9d992_ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89/extract-utilities/0.log" Oct 12 08:24:26 crc kubenswrapper[4599]: I1012 08:24:26.975266 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9d992_ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89/extract-content/0.log" Oct 12 08:24:26 crc kubenswrapper[4599]: I1012 08:24:26.977450 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9d992_ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89/extract-utilities/0.log" Oct 12 08:24:27 crc kubenswrapper[4599]: I1012 08:24:27.003622 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9d992_ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89/extract-content/0.log" Oct 12 08:24:27 crc kubenswrapper[4599]: I1012 08:24:27.106918 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9d992_ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89/extract-content/0.log" Oct 12 08:24:27 crc kubenswrapper[4599]: I1012 08:24:27.111455 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9d992_ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89/extract-utilities/0.log" Oct 
12 08:24:27 crc kubenswrapper[4599]: I1012 08:24:27.454450 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9d992_ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89/registry-server/0.log" Oct 12 08:24:35 crc kubenswrapper[4599]: I1012 08:24:35.545278 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:24:35 crc kubenswrapper[4599]: E1012 08:24:35.545926 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:24:49 crc kubenswrapper[4599]: I1012 08:24:49.545398 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:24:49 crc kubenswrapper[4599]: E1012 08:24:49.547526 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:25:04 crc kubenswrapper[4599]: I1012 08:25:04.545136 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:25:04 crc kubenswrapper[4599]: E1012 08:25:04.545673 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:25:15 crc kubenswrapper[4599]: I1012 08:25:15.545521 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:25:15 crc kubenswrapper[4599]: E1012 08:25:15.546071 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:25:30 crc kubenswrapper[4599]: I1012 08:25:30.545172 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:25:30 crc kubenswrapper[4599]: E1012 08:25:30.545706 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:25:42 crc kubenswrapper[4599]: I1012 08:25:42.824065 4599 generic.go:334] "Generic (PLEG): container finished" podID="fb5ad21b-8e6a-43cb-b194-16397872fd71" containerID="f4e1f78f2906119c4c04399a39cad0e49ac30185490301f7ca759337d4deed6e" exitCode=0 Oct 12 08:25:42 crc kubenswrapper[4599]: I1012 08:25:42.824254 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-s256c/must-gather-76gb2" event={"ID":"fb5ad21b-8e6a-43cb-b194-16397872fd71","Type":"ContainerDied","Data":"f4e1f78f2906119c4c04399a39cad0e49ac30185490301f7ca759337d4deed6e"} Oct 12 08:25:42 crc kubenswrapper[4599]: I1012 08:25:42.825085 4599 scope.go:117] "RemoveContainer" containerID="f4e1f78f2906119c4c04399a39cad0e49ac30185490301f7ca759337d4deed6e" Oct 12 08:25:42 crc kubenswrapper[4599]: I1012 08:25:42.870813 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-s256c_must-gather-76gb2_fb5ad21b-8e6a-43cb-b194-16397872fd71/gather/0.log" Oct 12 08:25:43 crc kubenswrapper[4599]: I1012 08:25:43.551193 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:25:43 crc kubenswrapper[4599]: E1012 08:25:43.551700 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:25:50 crc kubenswrapper[4599]: I1012 08:25:50.025834 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-s256c/must-gather-76gb2"] Oct 12 08:25:50 crc kubenswrapper[4599]: I1012 08:25:50.026462 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-s256c/must-gather-76gb2" podUID="fb5ad21b-8e6a-43cb-b194-16397872fd71" containerName="copy" containerID="cri-o://3c9f5a86f36bd272a50df83079ffbd8469e49e79ed4e90bb337b06616f967174" gracePeriod=2 Oct 12 08:25:50 crc kubenswrapper[4599]: I1012 08:25:50.029040 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-s256c/must-gather-76gb2"] Oct 12 08:25:50 crc 
kubenswrapper[4599]: I1012 08:25:50.372281 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-s256c_must-gather-76gb2_fb5ad21b-8e6a-43cb-b194-16397872fd71/copy/0.log" Oct 12 08:25:50 crc kubenswrapper[4599]: I1012 08:25:50.372914 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s256c/must-gather-76gb2" Oct 12 08:25:50 crc kubenswrapper[4599]: I1012 08:25:50.448296 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fb5ad21b-8e6a-43cb-b194-16397872fd71-must-gather-output\") pod \"fb5ad21b-8e6a-43cb-b194-16397872fd71\" (UID: \"fb5ad21b-8e6a-43cb-b194-16397872fd71\") " Oct 12 08:25:50 crc kubenswrapper[4599]: I1012 08:25:50.448517 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9qhk\" (UniqueName: \"kubernetes.io/projected/fb5ad21b-8e6a-43cb-b194-16397872fd71-kube-api-access-g9qhk\") pod \"fb5ad21b-8e6a-43cb-b194-16397872fd71\" (UID: \"fb5ad21b-8e6a-43cb-b194-16397872fd71\") " Oct 12 08:25:50 crc kubenswrapper[4599]: I1012 08:25:50.452896 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb5ad21b-8e6a-43cb-b194-16397872fd71-kube-api-access-g9qhk" (OuterVolumeSpecName: "kube-api-access-g9qhk") pod "fb5ad21b-8e6a-43cb-b194-16397872fd71" (UID: "fb5ad21b-8e6a-43cb-b194-16397872fd71"). InnerVolumeSpecName "kube-api-access-g9qhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:25:50 crc kubenswrapper[4599]: I1012 08:25:50.545493 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb5ad21b-8e6a-43cb-b194-16397872fd71-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "fb5ad21b-8e6a-43cb-b194-16397872fd71" (UID: "fb5ad21b-8e6a-43cb-b194-16397872fd71"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:25:50 crc kubenswrapper[4599]: I1012 08:25:50.550722 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9qhk\" (UniqueName: \"kubernetes.io/projected/fb5ad21b-8e6a-43cb-b194-16397872fd71-kube-api-access-g9qhk\") on node \"crc\" DevicePath \"\"" Oct 12 08:25:50 crc kubenswrapper[4599]: I1012 08:25:50.550749 4599 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fb5ad21b-8e6a-43cb-b194-16397872fd71-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 12 08:25:50 crc kubenswrapper[4599]: I1012 08:25:50.880573 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-s256c_must-gather-76gb2_fb5ad21b-8e6a-43cb-b194-16397872fd71/copy/0.log" Oct 12 08:25:50 crc kubenswrapper[4599]: I1012 08:25:50.880920 4599 generic.go:334] "Generic (PLEG): container finished" podID="fb5ad21b-8e6a-43cb-b194-16397872fd71" containerID="3c9f5a86f36bd272a50df83079ffbd8469e49e79ed4e90bb337b06616f967174" exitCode=143 Oct 12 08:25:50 crc kubenswrapper[4599]: I1012 08:25:50.880966 4599 scope.go:117] "RemoveContainer" containerID="3c9f5a86f36bd272a50df83079ffbd8469e49e79ed4e90bb337b06616f967174" Oct 12 08:25:50 crc kubenswrapper[4599]: I1012 08:25:50.880977 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s256c/must-gather-76gb2" Oct 12 08:25:50 crc kubenswrapper[4599]: I1012 08:25:50.895257 4599 scope.go:117] "RemoveContainer" containerID="f4e1f78f2906119c4c04399a39cad0e49ac30185490301f7ca759337d4deed6e" Oct 12 08:25:50 crc kubenswrapper[4599]: I1012 08:25:50.942402 4599 scope.go:117] "RemoveContainer" containerID="3c9f5a86f36bd272a50df83079ffbd8469e49e79ed4e90bb337b06616f967174" Oct 12 08:25:50 crc kubenswrapper[4599]: E1012 08:25:50.942753 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c9f5a86f36bd272a50df83079ffbd8469e49e79ed4e90bb337b06616f967174\": container with ID starting with 3c9f5a86f36bd272a50df83079ffbd8469e49e79ed4e90bb337b06616f967174 not found: ID does not exist" containerID="3c9f5a86f36bd272a50df83079ffbd8469e49e79ed4e90bb337b06616f967174" Oct 12 08:25:50 crc kubenswrapper[4599]: I1012 08:25:50.942783 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c9f5a86f36bd272a50df83079ffbd8469e49e79ed4e90bb337b06616f967174"} err="failed to get container status \"3c9f5a86f36bd272a50df83079ffbd8469e49e79ed4e90bb337b06616f967174\": rpc error: code = NotFound desc = could not find container \"3c9f5a86f36bd272a50df83079ffbd8469e49e79ed4e90bb337b06616f967174\": container with ID starting with 3c9f5a86f36bd272a50df83079ffbd8469e49e79ed4e90bb337b06616f967174 not found: ID does not exist" Oct 12 08:25:50 crc kubenswrapper[4599]: I1012 08:25:50.942801 4599 scope.go:117] "RemoveContainer" containerID="f4e1f78f2906119c4c04399a39cad0e49ac30185490301f7ca759337d4deed6e" Oct 12 08:25:50 crc kubenswrapper[4599]: E1012 08:25:50.943151 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4e1f78f2906119c4c04399a39cad0e49ac30185490301f7ca759337d4deed6e\": container with ID starting with 
f4e1f78f2906119c4c04399a39cad0e49ac30185490301f7ca759337d4deed6e not found: ID does not exist" containerID="f4e1f78f2906119c4c04399a39cad0e49ac30185490301f7ca759337d4deed6e" Oct 12 08:25:50 crc kubenswrapper[4599]: I1012 08:25:50.943187 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e1f78f2906119c4c04399a39cad0e49ac30185490301f7ca759337d4deed6e"} err="failed to get container status \"f4e1f78f2906119c4c04399a39cad0e49ac30185490301f7ca759337d4deed6e\": rpc error: code = NotFound desc = could not find container \"f4e1f78f2906119c4c04399a39cad0e49ac30185490301f7ca759337d4deed6e\": container with ID starting with f4e1f78f2906119c4c04399a39cad0e49ac30185490301f7ca759337d4deed6e not found: ID does not exist" Oct 12 08:25:51 crc kubenswrapper[4599]: I1012 08:25:51.552319 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb5ad21b-8e6a-43cb-b194-16397872fd71" path="/var/lib/kubelet/pods/fb5ad21b-8e6a-43cb-b194-16397872fd71/volumes" Oct 12 08:25:54 crc kubenswrapper[4599]: I1012 08:25:54.544575 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:25:54 crc kubenswrapper[4599]: E1012 08:25:54.544927 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:26:09 crc kubenswrapper[4599]: I1012 08:26:09.545397 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:26:09 crc kubenswrapper[4599]: E1012 08:26:09.546611 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:26:20 crc kubenswrapper[4599]: I1012 08:26:20.343514 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nsbng/must-gather-kg65p"] Oct 12 08:26:20 crc kubenswrapper[4599]: E1012 08:26:20.344202 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5ad21b-8e6a-43cb-b194-16397872fd71" containerName="copy" Oct 12 08:26:20 crc kubenswrapper[4599]: I1012 08:26:20.344215 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5ad21b-8e6a-43cb-b194-16397872fd71" containerName="copy" Oct 12 08:26:20 crc kubenswrapper[4599]: E1012 08:26:20.344241 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5ad21b-8e6a-43cb-b194-16397872fd71" containerName="gather" Oct 12 08:26:20 crc kubenswrapper[4599]: I1012 08:26:20.344248 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5ad21b-8e6a-43cb-b194-16397872fd71" containerName="gather" Oct 12 08:26:20 crc kubenswrapper[4599]: E1012 08:26:20.344256 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9a9df04-a178-4ddb-a9e3-16ff994ef030" containerName="container-00" Oct 12 08:26:20 crc kubenswrapper[4599]: I1012 08:26:20.344264 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9a9df04-a178-4ddb-a9e3-16ff994ef030" containerName="container-00" Oct 12 08:26:20 crc kubenswrapper[4599]: I1012 08:26:20.344499 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9a9df04-a178-4ddb-a9e3-16ff994ef030" containerName="container-00" Oct 12 08:26:20 crc kubenswrapper[4599]: I1012 08:26:20.344510 4599 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fb5ad21b-8e6a-43cb-b194-16397872fd71" containerName="copy" Oct 12 08:26:20 crc kubenswrapper[4599]: I1012 08:26:20.344526 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb5ad21b-8e6a-43cb-b194-16397872fd71" containerName="gather" Oct 12 08:26:20 crc kubenswrapper[4599]: I1012 08:26:20.345412 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nsbng/must-gather-kg65p" Oct 12 08:26:20 crc kubenswrapper[4599]: I1012 08:26:20.351324 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-nsbng"/"kube-root-ca.crt" Oct 12 08:26:20 crc kubenswrapper[4599]: I1012 08:26:20.351351 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-nsbng"/"default-dockercfg-8df7x" Oct 12 08:26:20 crc kubenswrapper[4599]: I1012 08:26:20.351552 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-nsbng"/"openshift-service-ca.crt" Oct 12 08:26:20 crc kubenswrapper[4599]: I1012 08:26:20.365866 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nsbng/must-gather-kg65p"] Oct 12 08:26:20 crc kubenswrapper[4599]: I1012 08:26:20.385041 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l562\" (UniqueName: \"kubernetes.io/projected/b0a8657a-c347-4913-afa8-020edbd6713a-kube-api-access-9l562\") pod \"must-gather-kg65p\" (UID: \"b0a8657a-c347-4913-afa8-020edbd6713a\") " pod="openshift-must-gather-nsbng/must-gather-kg65p" Oct 12 08:26:20 crc kubenswrapper[4599]: I1012 08:26:20.385106 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b0a8657a-c347-4913-afa8-020edbd6713a-must-gather-output\") pod \"must-gather-kg65p\" (UID: \"b0a8657a-c347-4913-afa8-020edbd6713a\") " 
pod="openshift-must-gather-nsbng/must-gather-kg65p" Oct 12 08:26:20 crc kubenswrapper[4599]: I1012 08:26:20.486747 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l562\" (UniqueName: \"kubernetes.io/projected/b0a8657a-c347-4913-afa8-020edbd6713a-kube-api-access-9l562\") pod \"must-gather-kg65p\" (UID: \"b0a8657a-c347-4913-afa8-020edbd6713a\") " pod="openshift-must-gather-nsbng/must-gather-kg65p" Oct 12 08:26:20 crc kubenswrapper[4599]: I1012 08:26:20.486786 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b0a8657a-c347-4913-afa8-020edbd6713a-must-gather-output\") pod \"must-gather-kg65p\" (UID: \"b0a8657a-c347-4913-afa8-020edbd6713a\") " pod="openshift-must-gather-nsbng/must-gather-kg65p" Oct 12 08:26:20 crc kubenswrapper[4599]: I1012 08:26:20.487147 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b0a8657a-c347-4913-afa8-020edbd6713a-must-gather-output\") pod \"must-gather-kg65p\" (UID: \"b0a8657a-c347-4913-afa8-020edbd6713a\") " pod="openshift-must-gather-nsbng/must-gather-kg65p" Oct 12 08:26:20 crc kubenswrapper[4599]: I1012 08:26:20.500975 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l562\" (UniqueName: \"kubernetes.io/projected/b0a8657a-c347-4913-afa8-020edbd6713a-kube-api-access-9l562\") pod \"must-gather-kg65p\" (UID: \"b0a8657a-c347-4913-afa8-020edbd6713a\") " pod="openshift-must-gather-nsbng/must-gather-kg65p" Oct 12 08:26:20 crc kubenswrapper[4599]: I1012 08:26:20.660252 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nsbng/must-gather-kg65p" Oct 12 08:26:21 crc kubenswrapper[4599]: I1012 08:26:21.106081 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nsbng/must-gather-kg65p"] Oct 12 08:26:22 crc kubenswrapper[4599]: I1012 08:26:22.091079 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nsbng/must-gather-kg65p" event={"ID":"b0a8657a-c347-4913-afa8-020edbd6713a","Type":"ContainerStarted","Data":"a81e6297c3114d099b16dabffb92bddbec48235ee56b4623a6d93e8a80ed563d"} Oct 12 08:26:22 crc kubenswrapper[4599]: I1012 08:26:22.091372 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nsbng/must-gather-kg65p" event={"ID":"b0a8657a-c347-4913-afa8-020edbd6713a","Type":"ContainerStarted","Data":"9edae95dc665c176ae8ee0b31c5a4a8849bbb7a4b6a14ad7882e4b9dada95ddc"} Oct 12 08:26:22 crc kubenswrapper[4599]: I1012 08:26:22.091390 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nsbng/must-gather-kg65p" event={"ID":"b0a8657a-c347-4913-afa8-020edbd6713a","Type":"ContainerStarted","Data":"8e9161e95953a2aeb06d3fe904507e811fbb706c4d3822bca30195169097b72c"} Oct 12 08:26:22 crc kubenswrapper[4599]: I1012 08:26:22.105241 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nsbng/must-gather-kg65p" podStartSLOduration=2.105223737 podStartE2EDuration="2.105223737s" podCreationTimestamp="2025-10-12 08:26:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 08:26:22.101665286 +0000 UTC m=+3078.890860788" watchObservedRunningTime="2025-10-12 08:26:22.105223737 +0000 UTC m=+3078.894419238" Oct 12 08:26:23 crc kubenswrapper[4599]: I1012 08:26:23.894542 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nsbng/crc-debug-ll6gs"] Oct 12 08:26:23 crc kubenswrapper[4599]: 
I1012 08:26:23.896043 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nsbng/crc-debug-ll6gs" Oct 12 08:26:23 crc kubenswrapper[4599]: I1012 08:26:23.946824 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67fe3b91-e3ea-47a5-8786-90d0d27e0956-host\") pod \"crc-debug-ll6gs\" (UID: \"67fe3b91-e3ea-47a5-8786-90d0d27e0956\") " pod="openshift-must-gather-nsbng/crc-debug-ll6gs" Oct 12 08:26:23 crc kubenswrapper[4599]: I1012 08:26:23.946891 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grxsv\" (UniqueName: \"kubernetes.io/projected/67fe3b91-e3ea-47a5-8786-90d0d27e0956-kube-api-access-grxsv\") pod \"crc-debug-ll6gs\" (UID: \"67fe3b91-e3ea-47a5-8786-90d0d27e0956\") " pod="openshift-must-gather-nsbng/crc-debug-ll6gs" Oct 12 08:26:24 crc kubenswrapper[4599]: I1012 08:26:24.048916 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67fe3b91-e3ea-47a5-8786-90d0d27e0956-host\") pod \"crc-debug-ll6gs\" (UID: \"67fe3b91-e3ea-47a5-8786-90d0d27e0956\") " pod="openshift-must-gather-nsbng/crc-debug-ll6gs" Oct 12 08:26:24 crc kubenswrapper[4599]: I1012 08:26:24.048982 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grxsv\" (UniqueName: \"kubernetes.io/projected/67fe3b91-e3ea-47a5-8786-90d0d27e0956-kube-api-access-grxsv\") pod \"crc-debug-ll6gs\" (UID: \"67fe3b91-e3ea-47a5-8786-90d0d27e0956\") " pod="openshift-must-gather-nsbng/crc-debug-ll6gs" Oct 12 08:26:24 crc kubenswrapper[4599]: I1012 08:26:24.049044 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67fe3b91-e3ea-47a5-8786-90d0d27e0956-host\") pod \"crc-debug-ll6gs\" (UID: \"67fe3b91-e3ea-47a5-8786-90d0d27e0956\") 
" pod="openshift-must-gather-nsbng/crc-debug-ll6gs" Oct 12 08:26:24 crc kubenswrapper[4599]: I1012 08:26:24.064313 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grxsv\" (UniqueName: \"kubernetes.io/projected/67fe3b91-e3ea-47a5-8786-90d0d27e0956-kube-api-access-grxsv\") pod \"crc-debug-ll6gs\" (UID: \"67fe3b91-e3ea-47a5-8786-90d0d27e0956\") " pod="openshift-must-gather-nsbng/crc-debug-ll6gs" Oct 12 08:26:24 crc kubenswrapper[4599]: I1012 08:26:24.212958 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nsbng/crc-debug-ll6gs" Oct 12 08:26:24 crc kubenswrapper[4599]: W1012 08:26:24.234277 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67fe3b91_e3ea_47a5_8786_90d0d27e0956.slice/crio-85e7e950bf188343145cbde060b5ea5740e81b62282087e3cf48bdf3643fb9fe WatchSource:0}: Error finding container 85e7e950bf188343145cbde060b5ea5740e81b62282087e3cf48bdf3643fb9fe: Status 404 returned error can't find the container with id 85e7e950bf188343145cbde060b5ea5740e81b62282087e3cf48bdf3643fb9fe Oct 12 08:26:24 crc kubenswrapper[4599]: I1012 08:26:24.545954 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:26:24 crc kubenswrapper[4599]: E1012 08:26:24.546392 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:26:25 crc kubenswrapper[4599]: I1012 08:26:25.112124 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-nsbng/crc-debug-ll6gs" event={"ID":"67fe3b91-e3ea-47a5-8786-90d0d27e0956","Type":"ContainerStarted","Data":"b6bbf7c45df86daa48dbebd7a0dd9e744069c056cc6e4005e6ef2136cfe6c7b5"} Oct 12 08:26:25 crc kubenswrapper[4599]: I1012 08:26:25.112541 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nsbng/crc-debug-ll6gs" event={"ID":"67fe3b91-e3ea-47a5-8786-90d0d27e0956","Type":"ContainerStarted","Data":"85e7e950bf188343145cbde060b5ea5740e81b62282087e3cf48bdf3643fb9fe"} Oct 12 08:26:25 crc kubenswrapper[4599]: I1012 08:26:25.125647 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nsbng/crc-debug-ll6gs" podStartSLOduration=2.125635434 podStartE2EDuration="2.125635434s" podCreationTimestamp="2025-10-12 08:26:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 08:26:25.122112731 +0000 UTC m=+3081.911308233" watchObservedRunningTime="2025-10-12 08:26:25.125635434 +0000 UTC m=+3081.914830937" Oct 12 08:26:38 crc kubenswrapper[4599]: I1012 08:26:38.545174 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:26:38 crc kubenswrapper[4599]: E1012 08:26:38.545765 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:26:49 crc kubenswrapper[4599]: I1012 08:26:49.259153 4599 generic.go:334] "Generic (PLEG): container finished" podID="67fe3b91-e3ea-47a5-8786-90d0d27e0956" 
containerID="b6bbf7c45df86daa48dbebd7a0dd9e744069c056cc6e4005e6ef2136cfe6c7b5" exitCode=0 Oct 12 08:26:49 crc kubenswrapper[4599]: I1012 08:26:49.259232 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nsbng/crc-debug-ll6gs" event={"ID":"67fe3b91-e3ea-47a5-8786-90d0d27e0956","Type":"ContainerDied","Data":"b6bbf7c45df86daa48dbebd7a0dd9e744069c056cc6e4005e6ef2136cfe6c7b5"} Oct 12 08:26:50 crc kubenswrapper[4599]: I1012 08:26:50.338413 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nsbng/crc-debug-ll6gs" Oct 12 08:26:50 crc kubenswrapper[4599]: I1012 08:26:50.369412 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nsbng/crc-debug-ll6gs"] Oct 12 08:26:50 crc kubenswrapper[4599]: I1012 08:26:50.373754 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nsbng/crc-debug-ll6gs"] Oct 12 08:26:50 crc kubenswrapper[4599]: I1012 08:26:50.479355 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67fe3b91-e3ea-47a5-8786-90d0d27e0956-host\") pod \"67fe3b91-e3ea-47a5-8786-90d0d27e0956\" (UID: \"67fe3b91-e3ea-47a5-8786-90d0d27e0956\") " Oct 12 08:26:50 crc kubenswrapper[4599]: I1012 08:26:50.479472 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grxsv\" (UniqueName: \"kubernetes.io/projected/67fe3b91-e3ea-47a5-8786-90d0d27e0956-kube-api-access-grxsv\") pod \"67fe3b91-e3ea-47a5-8786-90d0d27e0956\" (UID: \"67fe3b91-e3ea-47a5-8786-90d0d27e0956\") " Oct 12 08:26:50 crc kubenswrapper[4599]: I1012 08:26:50.479483 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67fe3b91-e3ea-47a5-8786-90d0d27e0956-host" (OuterVolumeSpecName: "host") pod "67fe3b91-e3ea-47a5-8786-90d0d27e0956" (UID: "67fe3b91-e3ea-47a5-8786-90d0d27e0956"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 08:26:50 crc kubenswrapper[4599]: I1012 08:26:50.479845 4599 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67fe3b91-e3ea-47a5-8786-90d0d27e0956-host\") on node \"crc\" DevicePath \"\"" Oct 12 08:26:50 crc kubenswrapper[4599]: I1012 08:26:50.484414 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67fe3b91-e3ea-47a5-8786-90d0d27e0956-kube-api-access-grxsv" (OuterVolumeSpecName: "kube-api-access-grxsv") pod "67fe3b91-e3ea-47a5-8786-90d0d27e0956" (UID: "67fe3b91-e3ea-47a5-8786-90d0d27e0956"). InnerVolumeSpecName "kube-api-access-grxsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:26:50 crc kubenswrapper[4599]: I1012 08:26:50.582323 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grxsv\" (UniqueName: \"kubernetes.io/projected/67fe3b91-e3ea-47a5-8786-90d0d27e0956-kube-api-access-grxsv\") on node \"crc\" DevicePath \"\"" Oct 12 08:26:51 crc kubenswrapper[4599]: I1012 08:26:51.274888 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85e7e950bf188343145cbde060b5ea5740e81b62282087e3cf48bdf3643fb9fe" Oct 12 08:26:51 crc kubenswrapper[4599]: I1012 08:26:51.274992 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nsbng/crc-debug-ll6gs" Oct 12 08:26:51 crc kubenswrapper[4599]: I1012 08:26:51.514326 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nsbng/crc-debug-658qq"] Oct 12 08:26:51 crc kubenswrapper[4599]: E1012 08:26:51.514694 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67fe3b91-e3ea-47a5-8786-90d0d27e0956" containerName="container-00" Oct 12 08:26:51 crc kubenswrapper[4599]: I1012 08:26:51.514706 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="67fe3b91-e3ea-47a5-8786-90d0d27e0956" containerName="container-00" Oct 12 08:26:51 crc kubenswrapper[4599]: I1012 08:26:51.515599 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="67fe3b91-e3ea-47a5-8786-90d0d27e0956" containerName="container-00" Oct 12 08:26:51 crc kubenswrapper[4599]: I1012 08:26:51.516258 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nsbng/crc-debug-658qq" Oct 12 08:26:51 crc kubenswrapper[4599]: I1012 08:26:51.545445 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:26:51 crc kubenswrapper[4599]: E1012 08:26:51.545751 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:26:51 crc kubenswrapper[4599]: I1012 08:26:51.552241 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67fe3b91-e3ea-47a5-8786-90d0d27e0956" path="/var/lib/kubelet/pods/67fe3b91-e3ea-47a5-8786-90d0d27e0956/volumes" Oct 12 08:26:51 crc kubenswrapper[4599]: I1012 
08:26:51.697833 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ab600f7-dc1f-4318-a8c9-c4459aad9272-host\") pod \"crc-debug-658qq\" (UID: \"8ab600f7-dc1f-4318-a8c9-c4459aad9272\") " pod="openshift-must-gather-nsbng/crc-debug-658qq" Oct 12 08:26:51 crc kubenswrapper[4599]: I1012 08:26:51.698032 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsxtw\" (UniqueName: \"kubernetes.io/projected/8ab600f7-dc1f-4318-a8c9-c4459aad9272-kube-api-access-xsxtw\") pod \"crc-debug-658qq\" (UID: \"8ab600f7-dc1f-4318-a8c9-c4459aad9272\") " pod="openshift-must-gather-nsbng/crc-debug-658qq" Oct 12 08:26:51 crc kubenswrapper[4599]: I1012 08:26:51.799410 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsxtw\" (UniqueName: \"kubernetes.io/projected/8ab600f7-dc1f-4318-a8c9-c4459aad9272-kube-api-access-xsxtw\") pod \"crc-debug-658qq\" (UID: \"8ab600f7-dc1f-4318-a8c9-c4459aad9272\") " pod="openshift-must-gather-nsbng/crc-debug-658qq" Oct 12 08:26:51 crc kubenswrapper[4599]: I1012 08:26:51.799530 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ab600f7-dc1f-4318-a8c9-c4459aad9272-host\") pod \"crc-debug-658qq\" (UID: \"8ab600f7-dc1f-4318-a8c9-c4459aad9272\") " pod="openshift-must-gather-nsbng/crc-debug-658qq" Oct 12 08:26:51 crc kubenswrapper[4599]: I1012 08:26:51.799674 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ab600f7-dc1f-4318-a8c9-c4459aad9272-host\") pod \"crc-debug-658qq\" (UID: \"8ab600f7-dc1f-4318-a8c9-c4459aad9272\") " pod="openshift-must-gather-nsbng/crc-debug-658qq" Oct 12 08:26:51 crc kubenswrapper[4599]: I1012 08:26:51.814252 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xsxtw\" (UniqueName: \"kubernetes.io/projected/8ab600f7-dc1f-4318-a8c9-c4459aad9272-kube-api-access-xsxtw\") pod \"crc-debug-658qq\" (UID: \"8ab600f7-dc1f-4318-a8c9-c4459aad9272\") " pod="openshift-must-gather-nsbng/crc-debug-658qq" Oct 12 08:26:51 crc kubenswrapper[4599]: I1012 08:26:51.828543 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nsbng/crc-debug-658qq" Oct 12 08:26:52 crc kubenswrapper[4599]: I1012 08:26:52.282347 4599 generic.go:334] "Generic (PLEG): container finished" podID="8ab600f7-dc1f-4318-a8c9-c4459aad9272" containerID="892bd901ba6a2ba41c1ecd5c14b55e940f993cb2f9f1b014c4456256b32966fb" exitCode=0 Oct 12 08:26:52 crc kubenswrapper[4599]: I1012 08:26:52.282436 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nsbng/crc-debug-658qq" event={"ID":"8ab600f7-dc1f-4318-a8c9-c4459aad9272","Type":"ContainerDied","Data":"892bd901ba6a2ba41c1ecd5c14b55e940f993cb2f9f1b014c4456256b32966fb"} Oct 12 08:26:52 crc kubenswrapper[4599]: I1012 08:26:52.282691 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nsbng/crc-debug-658qq" event={"ID":"8ab600f7-dc1f-4318-a8c9-c4459aad9272","Type":"ContainerStarted","Data":"f27be9f5622995ac3b323aa491fa0af060afe1876ca8972ac4898c3057e2973f"} Oct 12 08:26:52 crc kubenswrapper[4599]: I1012 08:26:52.658703 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nsbng/crc-debug-658qq"] Oct 12 08:26:52 crc kubenswrapper[4599]: I1012 08:26:52.665412 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nsbng/crc-debug-658qq"] Oct 12 08:26:53 crc kubenswrapper[4599]: I1012 08:26:53.372176 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nsbng/crc-debug-658qq" Oct 12 08:26:53 crc kubenswrapper[4599]: I1012 08:26:53.523852 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsxtw\" (UniqueName: \"kubernetes.io/projected/8ab600f7-dc1f-4318-a8c9-c4459aad9272-kube-api-access-xsxtw\") pod \"8ab600f7-dc1f-4318-a8c9-c4459aad9272\" (UID: \"8ab600f7-dc1f-4318-a8c9-c4459aad9272\") " Oct 12 08:26:53 crc kubenswrapper[4599]: I1012 08:26:53.524023 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ab600f7-dc1f-4318-a8c9-c4459aad9272-host\") pod \"8ab600f7-dc1f-4318-a8c9-c4459aad9272\" (UID: \"8ab600f7-dc1f-4318-a8c9-c4459aad9272\") " Oct 12 08:26:53 crc kubenswrapper[4599]: I1012 08:26:53.524063 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ab600f7-dc1f-4318-a8c9-c4459aad9272-host" (OuterVolumeSpecName: "host") pod "8ab600f7-dc1f-4318-a8c9-c4459aad9272" (UID: "8ab600f7-dc1f-4318-a8c9-c4459aad9272"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 08:26:53 crc kubenswrapper[4599]: I1012 08:26:53.524504 4599 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ab600f7-dc1f-4318-a8c9-c4459aad9272-host\") on node \"crc\" DevicePath \"\"" Oct 12 08:26:53 crc kubenswrapper[4599]: I1012 08:26:53.528362 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ab600f7-dc1f-4318-a8c9-c4459aad9272-kube-api-access-xsxtw" (OuterVolumeSpecName: "kube-api-access-xsxtw") pod "8ab600f7-dc1f-4318-a8c9-c4459aad9272" (UID: "8ab600f7-dc1f-4318-a8c9-c4459aad9272"). InnerVolumeSpecName "kube-api-access-xsxtw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:26:53 crc kubenswrapper[4599]: I1012 08:26:53.551710 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ab600f7-dc1f-4318-a8c9-c4459aad9272" path="/var/lib/kubelet/pods/8ab600f7-dc1f-4318-a8c9-c4459aad9272/volumes" Oct 12 08:26:53 crc kubenswrapper[4599]: I1012 08:26:53.626278 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsxtw\" (UniqueName: \"kubernetes.io/projected/8ab600f7-dc1f-4318-a8c9-c4459aad9272-kube-api-access-xsxtw\") on node \"crc\" DevicePath \"\"" Oct 12 08:26:53 crc kubenswrapper[4599]: I1012 08:26:53.788127 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nsbng/crc-debug-972r5"] Oct 12 08:26:53 crc kubenswrapper[4599]: E1012 08:26:53.788476 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab600f7-dc1f-4318-a8c9-c4459aad9272" containerName="container-00" Oct 12 08:26:53 crc kubenswrapper[4599]: I1012 08:26:53.788488 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab600f7-dc1f-4318-a8c9-c4459aad9272" containerName="container-00" Oct 12 08:26:53 crc kubenswrapper[4599]: I1012 08:26:53.788650 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ab600f7-dc1f-4318-a8c9-c4459aad9272" containerName="container-00" Oct 12 08:26:53 crc kubenswrapper[4599]: I1012 08:26:53.789157 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nsbng/crc-debug-972r5" Oct 12 08:26:53 crc kubenswrapper[4599]: I1012 08:26:53.828934 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd4ebd54-da27-4008-9ed4-708c91fae5a3-host\") pod \"crc-debug-972r5\" (UID: \"fd4ebd54-da27-4008-9ed4-708c91fae5a3\") " pod="openshift-must-gather-nsbng/crc-debug-972r5" Oct 12 08:26:53 crc kubenswrapper[4599]: I1012 08:26:53.829002 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwm96\" (UniqueName: \"kubernetes.io/projected/fd4ebd54-da27-4008-9ed4-708c91fae5a3-kube-api-access-fwm96\") pod \"crc-debug-972r5\" (UID: \"fd4ebd54-da27-4008-9ed4-708c91fae5a3\") " pod="openshift-must-gather-nsbng/crc-debug-972r5" Oct 12 08:26:53 crc kubenswrapper[4599]: I1012 08:26:53.933475 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd4ebd54-da27-4008-9ed4-708c91fae5a3-host\") pod \"crc-debug-972r5\" (UID: \"fd4ebd54-da27-4008-9ed4-708c91fae5a3\") " pod="openshift-must-gather-nsbng/crc-debug-972r5" Oct 12 08:26:53 crc kubenswrapper[4599]: I1012 08:26:53.933552 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwm96\" (UniqueName: \"kubernetes.io/projected/fd4ebd54-da27-4008-9ed4-708c91fae5a3-kube-api-access-fwm96\") pod \"crc-debug-972r5\" (UID: \"fd4ebd54-da27-4008-9ed4-708c91fae5a3\") " pod="openshift-must-gather-nsbng/crc-debug-972r5" Oct 12 08:26:53 crc kubenswrapper[4599]: I1012 08:26:53.933779 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd4ebd54-da27-4008-9ed4-708c91fae5a3-host\") pod \"crc-debug-972r5\" (UID: \"fd4ebd54-da27-4008-9ed4-708c91fae5a3\") " pod="openshift-must-gather-nsbng/crc-debug-972r5" Oct 12 08:26:53 crc 
kubenswrapper[4599]: I1012 08:26:53.954315 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwm96\" (UniqueName: \"kubernetes.io/projected/fd4ebd54-da27-4008-9ed4-708c91fae5a3-kube-api-access-fwm96\") pod \"crc-debug-972r5\" (UID: \"fd4ebd54-da27-4008-9ed4-708c91fae5a3\") " pod="openshift-must-gather-nsbng/crc-debug-972r5" Oct 12 08:26:54 crc kubenswrapper[4599]: I1012 08:26:54.101465 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nsbng/crc-debug-972r5" Oct 12 08:26:54 crc kubenswrapper[4599]: W1012 08:26:54.121167 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd4ebd54_da27_4008_9ed4_708c91fae5a3.slice/crio-d46a74b76d3b0cf2acbe889e661669ee9865974bb4367b8d0aaaba75f4d4e5e2 WatchSource:0}: Error finding container d46a74b76d3b0cf2acbe889e661669ee9865974bb4367b8d0aaaba75f4d4e5e2: Status 404 returned error can't find the container with id d46a74b76d3b0cf2acbe889e661669ee9865974bb4367b8d0aaaba75f4d4e5e2 Oct 12 08:26:54 crc kubenswrapper[4599]: I1012 08:26:54.295710 4599 generic.go:334] "Generic (PLEG): container finished" podID="fd4ebd54-da27-4008-9ed4-708c91fae5a3" containerID="5fea6f1fefe36192056b356e98bf8c3aa38574b14408e98500f7232c44a0db5b" exitCode=0 Oct 12 08:26:54 crc kubenswrapper[4599]: I1012 08:26:54.295771 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nsbng/crc-debug-972r5" event={"ID":"fd4ebd54-da27-4008-9ed4-708c91fae5a3","Type":"ContainerDied","Data":"5fea6f1fefe36192056b356e98bf8c3aa38574b14408e98500f7232c44a0db5b"} Oct 12 08:26:54 crc kubenswrapper[4599]: I1012 08:26:54.295797 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nsbng/crc-debug-972r5" event={"ID":"fd4ebd54-da27-4008-9ed4-708c91fae5a3","Type":"ContainerStarted","Data":"d46a74b76d3b0cf2acbe889e661669ee9865974bb4367b8d0aaaba75f4d4e5e2"} Oct 12 
08:26:54 crc kubenswrapper[4599]: I1012 08:26:54.297761 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nsbng/crc-debug-658qq" Oct 12 08:26:54 crc kubenswrapper[4599]: I1012 08:26:54.297770 4599 scope.go:117] "RemoveContainer" containerID="892bd901ba6a2ba41c1ecd5c14b55e940f993cb2f9f1b014c4456256b32966fb" Oct 12 08:26:54 crc kubenswrapper[4599]: I1012 08:26:54.328134 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nsbng/crc-debug-972r5"] Oct 12 08:26:54 crc kubenswrapper[4599]: I1012 08:26:54.336082 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nsbng/crc-debug-972r5"] Oct 12 08:26:55 crc kubenswrapper[4599]: I1012 08:26:55.375198 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nsbng/crc-debug-972r5" Oct 12 08:26:55 crc kubenswrapper[4599]: I1012 08:26:55.558844 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd4ebd54-da27-4008-9ed4-708c91fae5a3-host\") pod \"fd4ebd54-da27-4008-9ed4-708c91fae5a3\" (UID: \"fd4ebd54-da27-4008-9ed4-708c91fae5a3\") " Oct 12 08:26:55 crc kubenswrapper[4599]: I1012 08:26:55.558930 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd4ebd54-da27-4008-9ed4-708c91fae5a3-host" (OuterVolumeSpecName: "host") pod "fd4ebd54-da27-4008-9ed4-708c91fae5a3" (UID: "fd4ebd54-da27-4008-9ed4-708c91fae5a3"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 08:26:55 crc kubenswrapper[4599]: I1012 08:26:55.559173 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwm96\" (UniqueName: \"kubernetes.io/projected/fd4ebd54-da27-4008-9ed4-708c91fae5a3-kube-api-access-fwm96\") pod \"fd4ebd54-da27-4008-9ed4-708c91fae5a3\" (UID: \"fd4ebd54-da27-4008-9ed4-708c91fae5a3\") " Oct 12 08:26:55 crc kubenswrapper[4599]: I1012 08:26:55.559723 4599 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd4ebd54-da27-4008-9ed4-708c91fae5a3-host\") on node \"crc\" DevicePath \"\"" Oct 12 08:26:55 crc kubenswrapper[4599]: I1012 08:26:55.564291 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd4ebd54-da27-4008-9ed4-708c91fae5a3-kube-api-access-fwm96" (OuterVolumeSpecName: "kube-api-access-fwm96") pod "fd4ebd54-da27-4008-9ed4-708c91fae5a3" (UID: "fd4ebd54-da27-4008-9ed4-708c91fae5a3"). InnerVolumeSpecName "kube-api-access-fwm96". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:26:55 crc kubenswrapper[4599]: I1012 08:26:55.660938 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwm96\" (UniqueName: \"kubernetes.io/projected/fd4ebd54-da27-4008-9ed4-708c91fae5a3-kube-api-access-fwm96\") on node \"crc\" DevicePath \"\"" Oct 12 08:26:56 crc kubenswrapper[4599]: I1012 08:26:56.313268 4599 scope.go:117] "RemoveContainer" containerID="5fea6f1fefe36192056b356e98bf8c3aa38574b14408e98500f7232c44a0db5b" Oct 12 08:26:56 crc kubenswrapper[4599]: I1012 08:26:56.313302 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nsbng/crc-debug-972r5" Oct 12 08:26:57 crc kubenswrapper[4599]: I1012 08:26:57.552514 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd4ebd54-da27-4008-9ed4-708c91fae5a3" path="/var/lib/kubelet/pods/fd4ebd54-da27-4008-9ed4-708c91fae5a3/volumes" Oct 12 08:27:02 crc kubenswrapper[4599]: I1012 08:27:02.545166 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:27:02 crc kubenswrapper[4599]: E1012 08:27:02.545764 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:27:02 crc kubenswrapper[4599]: I1012 08:27:02.713915 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-689dd94bf4-pwcdz_c247e243-5ad3-4e53-a733-a11d9407c42a/barbican-api/0.log" Oct 12 08:27:02 crc kubenswrapper[4599]: I1012 08:27:02.836870 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-689dd94bf4-pwcdz_c247e243-5ad3-4e53-a733-a11d9407c42a/barbican-api-log/0.log" Oct 12 08:27:02 crc kubenswrapper[4599]: I1012 08:27:02.893172 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58d46f7854-2972q_82626116-d40d-45c2-8a6f-513cb12f6b19/barbican-keystone-listener/0.log" Oct 12 08:27:02 crc kubenswrapper[4599]: I1012 08:27:02.911279 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58d46f7854-2972q_82626116-d40d-45c2-8a6f-513cb12f6b19/barbican-keystone-listener-log/0.log" Oct 12 08:27:03 crc kubenswrapper[4599]: 
I1012 08:27:03.027575 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d4cb4975-47tpw_f8d2d027-f32a-4708-b7cb-5302f1def41f/barbican-worker-log/0.log" Oct 12 08:27:03 crc kubenswrapper[4599]: I1012 08:27:03.035248 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d4cb4975-47tpw_f8d2d027-f32a-4708-b7cb-5302f1def41f/barbican-worker/0.log" Oct 12 08:27:03 crc kubenswrapper[4599]: I1012 08:27:03.124760 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-2jgtd_f2098dde-6e8b-4a07-80d7-fc8e6d2c665e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:27:03 crc kubenswrapper[4599]: I1012 08:27:03.190992 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_95d7fa90-5a03-4991-810a-59cf46e55ebf/ceilometer-central-agent/0.log" Oct 12 08:27:03 crc kubenswrapper[4599]: I1012 08:27:03.229307 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_95d7fa90-5a03-4991-810a-59cf46e55ebf/ceilometer-notification-agent/0.log" Oct 12 08:27:03 crc kubenswrapper[4599]: I1012 08:27:03.273035 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_95d7fa90-5a03-4991-810a-59cf46e55ebf/proxy-httpd/0.log" Oct 12 08:27:03 crc kubenswrapper[4599]: I1012 08:27:03.352564 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_95d7fa90-5a03-4991-810a-59cf46e55ebf/sg-core/0.log" Oct 12 08:27:03 crc kubenswrapper[4599]: I1012 08:27:03.443087 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5d006848-2829-4dab-b441-dddfc1737bfa/cinder-api/0.log" Oct 12 08:27:03 crc kubenswrapper[4599]: I1012 08:27:03.625762 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5d006848-2829-4dab-b441-dddfc1737bfa/cinder-api-log/0.log" Oct 12 08:27:03 crc 
kubenswrapper[4599]: I1012 08:27:03.768572 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_dfa968eb-bc29-434a-a2f6-6aebf5c8beda/cinder-scheduler/0.log" Oct 12 08:27:03 crc kubenswrapper[4599]: I1012 08:27:03.772757 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_dfa968eb-bc29-434a-a2f6-6aebf5c8beda/probe/0.log" Oct 12 08:27:03 crc kubenswrapper[4599]: I1012 08:27:03.852607 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-s5xrx_9988d105-d7f0-459a-a8d9-056ac0d3abab/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:27:03 crc kubenswrapper[4599]: I1012 08:27:03.989614 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-n2smx_dd9a9999-dc26-4df4-b259-dbdbc31766f3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:27:04 crc kubenswrapper[4599]: I1012 08:27:04.007176 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-sck65_9fd7ec79-0a36-4ac6-a81a-486df9b2ba89/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:27:04 crc kubenswrapper[4599]: I1012 08:27:04.124670 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d885c8d8c-mm9q2_aa8d5577-ea50-40e9-8549-7c7ad4da7ee6/init/0.log" Oct 12 08:27:04 crc kubenswrapper[4599]: I1012 08:27:04.297632 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d885c8d8c-mm9q2_aa8d5577-ea50-40e9-8549-7c7ad4da7ee6/init/0.log" Oct 12 08:27:04 crc kubenswrapper[4599]: I1012 08:27:04.354816 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d885c8d8c-mm9q2_aa8d5577-ea50-40e9-8549-7c7ad4da7ee6/dnsmasq-dns/0.log" Oct 12 08:27:04 crc kubenswrapper[4599]: I1012 08:27:04.388879 4599 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-bphhh_f37be313-7217-4822-82a3-b1c6edd70a45/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:27:04 crc kubenswrapper[4599]: I1012 08:27:04.526707 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a26e3618-b309-4ac5-b0e1-39feba422ef6/glance-log/0.log" Oct 12 08:27:04 crc kubenswrapper[4599]: I1012 08:27:04.527813 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a26e3618-b309-4ac5-b0e1-39feba422ef6/glance-httpd/0.log" Oct 12 08:27:04 crc kubenswrapper[4599]: I1012 08:27:04.663002 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7fec8099-a2ba-4cbd-af30-75a787e3ead1/glance-httpd/0.log" Oct 12 08:27:04 crc kubenswrapper[4599]: I1012 08:27:04.680236 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7fec8099-a2ba-4cbd-af30-75a787e3ead1/glance-log/0.log" Oct 12 08:27:04 crc kubenswrapper[4599]: I1012 08:27:04.720755 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-2hsx4_bc8b881e-7904-44fe-ae99-975ece57dc4c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:27:04 crc kubenswrapper[4599]: I1012 08:27:04.877127 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-6ngmq_e862c4d5-24d0-42ee-82f5-7a17fc6773aa/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:27:05 crc kubenswrapper[4599]: I1012 08:27:05.008596 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5d9d5b7486-4r48s_5d11880e-4007-4266-bf3b-8c1e3eea20b8/keystone-api/0.log" Oct 12 08:27:05 crc kubenswrapper[4599]: I1012 08:27:05.042422 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29337601-bb6kg_53717d04-56c1-42cc-af4a-f7edc51e3611/keystone-cron/0.log" Oct 12 08:27:05 crc kubenswrapper[4599]: I1012 08:27:05.169399 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_03a40b71-5a8f-42cd-97dd-e1b360a15b68/kube-state-metrics/0.log" Oct 12 08:27:05 crc kubenswrapper[4599]: I1012 08:27:05.259304 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rpvwh_ed0b094a-51d6-4287-b4a4-4a0934139fa2/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:27:05 crc kubenswrapper[4599]: I1012 08:27:05.537135 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dcf447b8f-d5qql_1f21bc7c-6371-45de-9766-ce9ad07df644/neutron-httpd/0.log" Oct 12 08:27:05 crc kubenswrapper[4599]: I1012 08:27:05.544553 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dcf447b8f-d5qql_1f21bc7c-6371-45de-9766-ce9ad07df644/neutron-api/0.log" Oct 12 08:27:05 crc kubenswrapper[4599]: I1012 08:27:05.894273 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-7rb25_d5a50516-f480-4da5-adb8-853dd9ce7b6c/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:27:06 crc kubenswrapper[4599]: I1012 08:27:06.247497 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_fd526957-c4dc-40c6-87e3-eb3784e09fb5/nova-api-log/0.log" Oct 12 08:27:06 crc kubenswrapper[4599]: I1012 08:27:06.278309 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_65c5adc9-0a5f-4631-bee5-c87a70c0d0a2/nova-cell0-conductor-conductor/0.log" Oct 12 08:27:06 crc kubenswrapper[4599]: I1012 08:27:06.543109 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_c21fe8be-d815-4e07-9ea8-e22d73e2dd8f/nova-cell1-conductor-conductor/0.log" Oct 12 08:27:06 crc kubenswrapper[4599]: I1012 08:27:06.570738 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_fd526957-c4dc-40c6-87e3-eb3784e09fb5/nova-api-api/0.log" Oct 12 08:27:06 crc kubenswrapper[4599]: I1012 08:27:06.572565 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_2abe9d8b-a086-4e4c-8873-3b50714935c9/nova-cell1-novncproxy-novncproxy/0.log" Oct 12 08:27:06 crc kubenswrapper[4599]: I1012 08:27:06.843291 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_092769ed-436e-4aee-b3e9-4b2eeb4c487e/nova-metadata-log/0.log" Oct 12 08:27:06 crc kubenswrapper[4599]: I1012 08:27:06.855278 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-qxq8q_c01400c9-ebac-486d-ac74-9cec09171386/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:27:07 crc kubenswrapper[4599]: I1012 08:27:07.204773 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e76c3f9c-bea3-4b35-852c-65d48f177d8a/mysql-bootstrap/0.log" Oct 12 08:27:07 crc kubenswrapper[4599]: I1012 08:27:07.311831 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_223d3e1a-86ac-49d9-a231-b77957770434/nova-scheduler-scheduler/0.log" Oct 12 08:27:07 crc kubenswrapper[4599]: I1012 08:27:07.339538 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e76c3f9c-bea3-4b35-852c-65d48f177d8a/mysql-bootstrap/0.log" Oct 12 08:27:07 crc kubenswrapper[4599]: I1012 08:27:07.366707 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e76c3f9c-bea3-4b35-852c-65d48f177d8a/galera/0.log" Oct 12 08:27:07 crc kubenswrapper[4599]: I1012 08:27:07.490143 
4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4a035bca-ccfe-4dc6-949a-44d2ddf0fa26/mysql-bootstrap/0.log" Oct 12 08:27:07 crc kubenswrapper[4599]: I1012 08:27:07.649573 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4a035bca-ccfe-4dc6-949a-44d2ddf0fa26/mysql-bootstrap/0.log" Oct 12 08:27:07 crc kubenswrapper[4599]: I1012 08:27:07.658176 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4a035bca-ccfe-4dc6-949a-44d2ddf0fa26/galera/0.log" Oct 12 08:27:07 crc kubenswrapper[4599]: I1012 08:27:07.709901 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_092769ed-436e-4aee-b3e9-4b2eeb4c487e/nova-metadata-metadata/0.log" Oct 12 08:27:07 crc kubenswrapper[4599]: I1012 08:27:07.791245 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_b6db97ed-496b-4f4d-bb27-2bce6e003912/openstackclient/0.log" Oct 12 08:27:07 crc kubenswrapper[4599]: I1012 08:27:07.936540 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9rbk6_ff6de4a7-bd76-46bc-a376-b1ec8c5ab712/ovn-controller/0.log" Oct 12 08:27:07 crc kubenswrapper[4599]: I1012 08:27:07.949456 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-4dn56_ce16e02d-17cf-467a-aca5-944a67d4cd79/openstack-network-exporter/0.log" Oct 12 08:27:08 crc kubenswrapper[4599]: I1012 08:27:08.068563 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8s62q_78cb767a-31ee-4e29-b075-e773a43272c2/ovsdb-server-init/0.log" Oct 12 08:27:08 crc kubenswrapper[4599]: I1012 08:27:08.200665 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8s62q_78cb767a-31ee-4e29-b075-e773a43272c2/ovsdb-server-init/0.log" Oct 12 08:27:08 crc kubenswrapper[4599]: I1012 08:27:08.297147 4599 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8s62q_78cb767a-31ee-4e29-b075-e773a43272c2/ovs-vswitchd/0.log" Oct 12 08:27:08 crc kubenswrapper[4599]: I1012 08:27:08.304248 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8s62q_78cb767a-31ee-4e29-b075-e773a43272c2/ovsdb-server/0.log" Oct 12 08:27:08 crc kubenswrapper[4599]: I1012 08:27:08.497888 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_07a217da-3192-46dd-a935-7b124b5e6961/ovn-northd/0.log" Oct 12 08:27:08 crc kubenswrapper[4599]: I1012 08:27:08.528186 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-nmnvn_29a6cb6f-6a2a-405e-a24b-5d49ed9288cd/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:27:08 crc kubenswrapper[4599]: I1012 08:27:08.530177 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_07a217da-3192-46dd-a935-7b124b5e6961/openstack-network-exporter/0.log" Oct 12 08:27:08 crc kubenswrapper[4599]: I1012 08:27:08.748197 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9d3345d9-bc30-42dc-98e0-bfd24fee35ab/openstack-network-exporter/0.log" Oct 12 08:27:08 crc kubenswrapper[4599]: I1012 08:27:08.887561 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9d3345d9-bc30-42dc-98e0-bfd24fee35ab/ovsdbserver-nb/0.log" Oct 12 08:27:08 crc kubenswrapper[4599]: I1012 08:27:08.985992 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b7b21b67-7112-4507-a5d2-9036f09a3cdf/ovsdbserver-sb/0.log" Oct 12 08:27:08 crc kubenswrapper[4599]: I1012 08:27:08.999653 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b7b21b67-7112-4507-a5d2-9036f09a3cdf/openstack-network-exporter/0.log" Oct 12 08:27:09 crc kubenswrapper[4599]: I1012 08:27:09.213763 4599 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_placement-55dcb544bd-wvf5d_6d346d4c-1358-4305-89ac-c9c012143de6/placement-api/0.log" Oct 12 08:27:09 crc kubenswrapper[4599]: I1012 08:27:09.261271 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-55dcb544bd-wvf5d_6d346d4c-1358-4305-89ac-c9c012143de6/placement-log/0.log" Oct 12 08:27:09 crc kubenswrapper[4599]: I1012 08:27:09.281082 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a9c90266-89e1-4527-8fa2-91826cbcc778/setup-container/0.log" Oct 12 08:27:09 crc kubenswrapper[4599]: I1012 08:27:09.405356 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a9c90266-89e1-4527-8fa2-91826cbcc778/setup-container/0.log" Oct 12 08:27:09 crc kubenswrapper[4599]: I1012 08:27:09.433651 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a9c90266-89e1-4527-8fa2-91826cbcc778/rabbitmq/0.log" Oct 12 08:27:09 crc kubenswrapper[4599]: I1012 08:27:09.486453 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ab388f40-a761-44f6-812f-df5cf4b02b73/setup-container/0.log" Oct 12 08:27:09 crc kubenswrapper[4599]: I1012 08:27:09.605594 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ab388f40-a761-44f6-812f-df5cf4b02b73/rabbitmq/0.log" Oct 12 08:27:09 crc kubenswrapper[4599]: I1012 08:27:09.613303 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ab388f40-a761-44f6-812f-df5cf4b02b73/setup-container/0.log" Oct 12 08:27:09 crc kubenswrapper[4599]: I1012 08:27:09.680368 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-rvv95_e741b043-6773-4cbb-88fc-d3dc8cd7d39d/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:27:09 crc kubenswrapper[4599]: I1012 
08:27:09.844866 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-sx5tv_39f84a87-f390-4864-85f1-d4df13fe6b93/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:27:09 crc kubenswrapper[4599]: I1012 08:27:09.920765 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-gmzjl_dc14ab80-62e8-47ec-bf5b-370ccfd95eff/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:27:10 crc kubenswrapper[4599]: I1012 08:27:10.053716 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-ssdqc_53901e82-60d6-4dd0-9ec9-15851ecb4215/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:27:10 crc kubenswrapper[4599]: I1012 08:27:10.085366 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-kbv4c_88ad049a-ede5-4ac3-842b-c1ab9199014a/ssh-known-hosts-edpm-deployment/0.log" Oct 12 08:27:10 crc kubenswrapper[4599]: I1012 08:27:10.312371 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-786bc7649f-6qt66_9a10375e-8317-473b-87b4-07c82831ac41/proxy-server/0.log" Oct 12 08:27:10 crc kubenswrapper[4599]: I1012 08:27:10.347494 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-786bc7649f-6qt66_9a10375e-8317-473b-87b4-07c82831ac41/proxy-httpd/0.log" Oct 12 08:27:10 crc kubenswrapper[4599]: I1012 08:27:10.508850 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/account-auditor/0.log" Oct 12 08:27:10 crc kubenswrapper[4599]: I1012 08:27:10.521828 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-w8g6f_3f44cfe9-f015-4084-b100-fbb08f528667/swift-ring-rebalance/0.log" Oct 12 08:27:10 crc kubenswrapper[4599]: I1012 08:27:10.590264 4599 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/account-reaper/0.log" Oct 12 08:27:10 crc kubenswrapper[4599]: I1012 08:27:10.709258 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/account-server/0.log" Oct 12 08:27:10 crc kubenswrapper[4599]: I1012 08:27:10.745986 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/account-replicator/0.log" Oct 12 08:27:10 crc kubenswrapper[4599]: I1012 08:27:10.747714 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/container-auditor/0.log" Oct 12 08:27:10 crc kubenswrapper[4599]: I1012 08:27:10.792119 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/container-replicator/0.log" Oct 12 08:27:10 crc kubenswrapper[4599]: I1012 08:27:10.907580 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/container-server/0.log" Oct 12 08:27:10 crc kubenswrapper[4599]: I1012 08:27:10.921824 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/object-auditor/0.log" Oct 12 08:27:10 crc kubenswrapper[4599]: I1012 08:27:10.969087 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/container-updater/0.log" Oct 12 08:27:10 crc kubenswrapper[4599]: I1012 08:27:10.990838 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/object-expirer/0.log" Oct 12 08:27:11 crc kubenswrapper[4599]: I1012 08:27:11.147600 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/object-replicator/0.log" Oct 12 08:27:11 crc kubenswrapper[4599]: I1012 08:27:11.176372 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/object-server/0.log" Oct 12 08:27:11 crc kubenswrapper[4599]: I1012 08:27:11.182565 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/rsync/0.log" Oct 12 08:27:11 crc kubenswrapper[4599]: I1012 08:27:11.207553 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/object-updater/0.log" Oct 12 08:27:11 crc kubenswrapper[4599]: I1012 08:27:11.316928 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7f1c5e37-2b6d-4058-86e1-466baaa0f6c4/swift-recon-cron/0.log" Oct 12 08:27:11 crc kubenswrapper[4599]: I1012 08:27:11.418526 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-b67cz_a489145f-f0fe-4e55-a9eb-29df1419aa2b/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:27:11 crc kubenswrapper[4599]: I1012 08:27:11.698644 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_11baa2a2-2767-4dde-96b3-708570c6575d/test-operator-logs-container/0.log" Oct 12 08:27:11 crc kubenswrapper[4599]: I1012 08:27:11.718035 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_bdf67e61-9c15-4079-9a9b-a74c40ad364f/tempest-tests-tempest-tests-runner/0.log" Oct 12 08:27:11 crc kubenswrapper[4599]: I1012 08:27:11.822863 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-d787p_0027b20a-21c6-437b-b807-50484ab21289/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 08:27:14 crc kubenswrapper[4599]: I1012 08:27:14.544729 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:27:14 crc kubenswrapper[4599]: E1012 08:27:14.545305 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:27:19 crc kubenswrapper[4599]: I1012 08:27:19.497205 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_fe3787ed-cb03-457b-ad65-b33044cccffd/memcached/0.log" Oct 12 08:27:26 crc kubenswrapper[4599]: I1012 08:27:26.545696 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:27:26 crc kubenswrapper[4599]: E1012 08:27:26.546309 4599 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5mz5c_openshift-machine-config-operator(cc694bce-8c25-4729-b452-29d44d3efe6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" Oct 12 08:27:28 crc kubenswrapper[4599]: I1012 08:27:28.622016 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-65btt_390a00af-1983-41ce-b7f2-3190e2d1594e/manager/0.log" Oct 12 
08:27:28 crc kubenswrapper[4599]: I1012 08:27:28.636664 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-65btt_390a00af-1983-41ce-b7f2-3190e2d1594e/kube-rbac-proxy/0.log" Oct 12 08:27:28 crc kubenswrapper[4599]: I1012 08:27:28.747535 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj_a64bef86-ebe2-411a-92d3-f63087030b92/util/0.log" Oct 12 08:27:28 crc kubenswrapper[4599]: I1012 08:27:28.883552 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj_a64bef86-ebe2-411a-92d3-f63087030b92/pull/0.log" Oct 12 08:27:28 crc kubenswrapper[4599]: I1012 08:27:28.883779 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj_a64bef86-ebe2-411a-92d3-f63087030b92/pull/0.log" Oct 12 08:27:28 crc kubenswrapper[4599]: I1012 08:27:28.885362 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj_a64bef86-ebe2-411a-92d3-f63087030b92/util/0.log" Oct 12 08:27:29 crc kubenswrapper[4599]: I1012 08:27:29.000248 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj_a64bef86-ebe2-411a-92d3-f63087030b92/util/0.log" Oct 12 08:27:29 crc kubenswrapper[4599]: I1012 08:27:29.022218 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj_a64bef86-ebe2-411a-92d3-f63087030b92/extract/0.log" Oct 12 08:27:29 crc kubenswrapper[4599]: I1012 08:27:29.037072 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bb5qpj_a64bef86-ebe2-411a-92d3-f63087030b92/pull/0.log" Oct 12 08:27:29 crc kubenswrapper[4599]: I1012 08:27:29.178943 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-nwqxv_f13311d0-566a-4c8d-823c-fae47384cd53/manager/0.log" Oct 12 08:27:29 crc kubenswrapper[4599]: I1012 08:27:29.189445 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-rd8wv_37355d3f-b321-446a-b0ac-5d3a770bd0c5/kube-rbac-proxy/0.log" Oct 12 08:27:29 crc kubenswrapper[4599]: I1012 08:27:29.189672 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-nwqxv_f13311d0-566a-4c8d-823c-fae47384cd53/kube-rbac-proxy/0.log" Oct 12 08:27:29 crc kubenswrapper[4599]: I1012 08:27:29.293095 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-rd8wv_37355d3f-b321-446a-b0ac-5d3a770bd0c5/manager/0.log" Oct 12 08:27:29 crc kubenswrapper[4599]: I1012 08:27:29.328270 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-kbqh6_4f32c957-e414-456c-b06e-6f38553efe85/kube-rbac-proxy/0.log" Oct 12 08:27:29 crc kubenswrapper[4599]: I1012 08:27:29.382587 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-kbqh6_4f32c957-e414-456c-b06e-6f38553efe85/manager/0.log" Oct 12 08:27:29 crc kubenswrapper[4599]: I1012 08:27:29.467561 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-n6scs_986744d6-7f19-4f08-9dfd-03629fe2ca58/kube-rbac-proxy/0.log" Oct 12 08:27:29 crc kubenswrapper[4599]: I1012 
08:27:29.504849 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-n6scs_986744d6-7f19-4f08-9dfd-03629fe2ca58/manager/0.log" Oct 12 08:27:29 crc kubenswrapper[4599]: I1012 08:27:29.588587 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-hlvdz_8ce364df-e28b-45cf-ae95-92ae415392f0/kube-rbac-proxy/0.log" Oct 12 08:27:29 crc kubenswrapper[4599]: I1012 08:27:29.625919 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-hlvdz_8ce364df-e28b-45cf-ae95-92ae415392f0/manager/0.log" Oct 12 08:27:29 crc kubenswrapper[4599]: I1012 08:27:29.660266 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-6xt98_32a9cc54-3488-4659-a83f-0a6dc0c402c9/kube-rbac-proxy/0.log" Oct 12 08:27:29 crc kubenswrapper[4599]: I1012 08:27:29.800372 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-gq5k6_f70b2a0c-df5a-4a41-89db-e1bf314ee45a/manager/0.log" Oct 12 08:27:29 crc kubenswrapper[4599]: I1012 08:27:29.840364 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-gq5k6_f70b2a0c-df5a-4a41-89db-e1bf314ee45a/kube-rbac-proxy/0.log" Oct 12 08:27:29 crc kubenswrapper[4599]: I1012 08:27:29.848888 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-6xt98_32a9cc54-3488-4659-a83f-0a6dc0c402c9/manager/0.log" Oct 12 08:27:29 crc kubenswrapper[4599]: I1012 08:27:29.960171 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-7zfpt_297195f2-cb07-4bd8-9994-82f16e2f83f3/kube-rbac-proxy/0.log" Oct 12 
08:27:30 crc kubenswrapper[4599]: I1012 08:27:30.037277 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-7zfpt_297195f2-cb07-4bd8-9994-82f16e2f83f3/manager/0.log" Oct 12 08:27:30 crc kubenswrapper[4599]: I1012 08:27:30.082686 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-sbgr8_6d340683-0013-4bf4-b98b-32610996ded4/kube-rbac-proxy/0.log" Oct 12 08:27:30 crc kubenswrapper[4599]: I1012 08:27:30.115193 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-sbgr8_6d340683-0013-4bf4-b98b-32610996ded4/manager/0.log" Oct 12 08:27:30 crc kubenswrapper[4599]: I1012 08:27:30.176378 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-5zhzq_07c5394f-62de-4eca-86c0-c534788aead5/kube-rbac-proxy/0.log" Oct 12 08:27:30 crc kubenswrapper[4599]: I1012 08:27:30.227222 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-5zhzq_07c5394f-62de-4eca-86c0-c534788aead5/manager/0.log" Oct 12 08:27:30 crc kubenswrapper[4599]: I1012 08:27:30.313792 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-cz2h5_e88fb634-df40-47e9-a349-e7ac89e134f2/kube-rbac-proxy/0.log" Oct 12 08:27:30 crc kubenswrapper[4599]: I1012 08:27:30.359701 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-cz2h5_e88fb634-df40-47e9-a349-e7ac89e134f2/manager/0.log" Oct 12 08:27:30 crc kubenswrapper[4599]: I1012 08:27:30.392301 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-md9kr_84280387-ac26-4496-8c00-72673a91cb12/kube-rbac-proxy/0.log" Oct 12 08:27:30 crc kubenswrapper[4599]: I1012 08:27:30.508970 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-md9kr_84280387-ac26-4496-8c00-72673a91cb12/manager/0.log" Oct 12 08:27:30 crc kubenswrapper[4599]: I1012 08:27:30.549319 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-8ccz2_bdeb0fae-d9af-4253-a90c-8a50255cc6fe/kube-rbac-proxy/0.log" Oct 12 08:27:30 crc kubenswrapper[4599]: I1012 08:27:30.550001 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-8ccz2_bdeb0fae-d9af-4253-a90c-8a50255cc6fe/manager/0.log" Oct 12 08:27:30 crc kubenswrapper[4599]: I1012 08:27:30.669058 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj_0cdab794-6175-4fb9-bd9d-c1080d45ee30/kube-rbac-proxy/0.log" Oct 12 08:27:30 crc kubenswrapper[4599]: I1012 08:27:30.686744 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5956dffb7b2gdkj_0cdab794-6175-4fb9-bd9d-c1080d45ee30/manager/0.log" Oct 12 08:27:30 crc kubenswrapper[4599]: I1012 08:27:30.811685 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b95c8954b-z5lsg_f4099eb0-0999-42a2-b525-5ae6b0ad984b/kube-rbac-proxy/0.log" Oct 12 08:27:30 crc kubenswrapper[4599]: I1012 08:27:30.865447 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-688d597459-gfhbq_bc697113-b995-44e9-92d2-070e55b12965/kube-rbac-proxy/0.log" Oct 12 08:27:31 crc 
kubenswrapper[4599]: I1012 08:27:31.124211 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-ds62j_2167f650-c160-4ede-ae67-5c8fd1f86b25/registry-server/0.log" Oct 12 08:27:31 crc kubenswrapper[4599]: I1012 08:27:31.124823 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-688d597459-gfhbq_bc697113-b995-44e9-92d2-070e55b12965/operator/0.log" Oct 12 08:27:31 crc kubenswrapper[4599]: I1012 08:27:31.364987 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79df5fb58c-2fhbx_dc9b0db0-82c1-492b-94de-c8f93e96364f/kube-rbac-proxy/0.log" Oct 12 08:27:31 crc kubenswrapper[4599]: I1012 08:27:31.391293 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79df5fb58c-2fhbx_dc9b0db0-82c1-492b-94de-c8f93e96364f/manager/0.log" Oct 12 08:27:31 crc kubenswrapper[4599]: I1012 08:27:31.464053 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-d2q7d_b1de56e0-d6e0-4c5a-9c4e-c725f171e142/kube-rbac-proxy/0.log" Oct 12 08:27:31 crc kubenswrapper[4599]: I1012 08:27:31.541474 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-d2q7d_b1de56e0-d6e0-4c5a-9c4e-c725f171e142/manager/0.log" Oct 12 08:27:31 crc kubenswrapper[4599]: I1012 08:27:31.625669 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-7sh6m_d5252e14-f285-43af-ace5-375bcfbe4c68/operator/0.log" Oct 12 08:27:31 crc kubenswrapper[4599]: I1012 08:27:31.663590 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b95c8954b-z5lsg_f4099eb0-0999-42a2-b525-5ae6b0ad984b/manager/0.log" 
Oct 12 08:27:31 crc kubenswrapper[4599]: I1012 08:27:31.698464 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-gzlp6_caf862ab-9fa9-4c44-8e6c-35599bcc45a1/kube-rbac-proxy/0.log" Oct 12 08:27:31 crc kubenswrapper[4599]: I1012 08:27:31.765062 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-gzlp6_caf862ab-9fa9-4c44-8e6c-35599bcc45a1/manager/0.log" Oct 12 08:27:31 crc kubenswrapper[4599]: I1012 08:27:31.793585 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-67cfc6749b-m9l8f_5c565798-c0f8-4d14-b531-386b1b0efc63/kube-rbac-proxy/0.log" Oct 12 08:27:31 crc kubenswrapper[4599]: I1012 08:27:31.854718 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-67cfc6749b-m9l8f_5c565798-c0f8-4d14-b531-386b1b0efc63/manager/0.log" Oct 12 08:27:31 crc kubenswrapper[4599]: I1012 08:27:31.922328 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5458f77c4-b6bwb_5f9bd209-b904-4998-843c-d4573b0a2cd0/kube-rbac-proxy/0.log" Oct 12 08:27:31 crc kubenswrapper[4599]: I1012 08:27:31.931844 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5458f77c4-b6bwb_5f9bd209-b904-4998-843c-d4573b0a2cd0/manager/0.log" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.001056 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f554bff7b-ffpvz_20be3b1c-5de2-4c22-a3af-215e2272d586/kube-rbac-proxy/0.log" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.009381 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f554bff7b-ffpvz_20be3b1c-5de2-4c22-a3af-215e2272d586/manager/0.log" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.349791 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2kg7f"] Oct 12 08:27:32 crc kubenswrapper[4599]: E1012 08:27:32.350090 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd4ebd54-da27-4008-9ed4-708c91fae5a3" containerName="container-00" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.350107 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd4ebd54-da27-4008-9ed4-708c91fae5a3" containerName="container-00" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.350272 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd4ebd54-da27-4008-9ed4-708c91fae5a3" containerName="container-00" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.351416 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2kg7f" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.358881 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2kg7f"] Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.482603 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e364767d-95fe-438e-8e4f-c73bd3b92a04-utilities\") pod \"certified-operators-2kg7f\" (UID: \"e364767d-95fe-438e-8e4f-c73bd3b92a04\") " pod="openshift-marketplace/certified-operators-2kg7f" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.482786 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frbnf\" (UniqueName: \"kubernetes.io/projected/e364767d-95fe-438e-8e4f-c73bd3b92a04-kube-api-access-frbnf\") pod \"certified-operators-2kg7f\" (UID: \"e364767d-95fe-438e-8e4f-c73bd3b92a04\") " pod="openshift-marketplace/certified-operators-2kg7f" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.483004 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e364767d-95fe-438e-8e4f-c73bd3b92a04-catalog-content\") pod \"certified-operators-2kg7f\" (UID: \"e364767d-95fe-438e-8e4f-c73bd3b92a04\") " pod="openshift-marketplace/certified-operators-2kg7f" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.545457 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4jgzx"] Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.548155 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4jgzx" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.586628 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e364767d-95fe-438e-8e4f-c73bd3b92a04-utilities\") pod \"certified-operators-2kg7f\" (UID: \"e364767d-95fe-438e-8e4f-c73bd3b92a04\") " pod="openshift-marketplace/certified-operators-2kg7f" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.586771 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frbnf\" (UniqueName: \"kubernetes.io/projected/e364767d-95fe-438e-8e4f-c73bd3b92a04-kube-api-access-frbnf\") pod \"certified-operators-2kg7f\" (UID: \"e364767d-95fe-438e-8e4f-c73bd3b92a04\") " pod="openshift-marketplace/certified-operators-2kg7f" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.591837 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e364767d-95fe-438e-8e4f-c73bd3b92a04-utilities\") pod \"certified-operators-2kg7f\" (UID: \"e364767d-95fe-438e-8e4f-c73bd3b92a04\") " pod="openshift-marketplace/certified-operators-2kg7f" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.596673 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e364767d-95fe-438e-8e4f-c73bd3b92a04-catalog-content\") pod \"certified-operators-2kg7f\" (UID: \"e364767d-95fe-438e-8e4f-c73bd3b92a04\") " pod="openshift-marketplace/certified-operators-2kg7f" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.600480 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e364767d-95fe-438e-8e4f-c73bd3b92a04-catalog-content\") pod \"certified-operators-2kg7f\" (UID: \"e364767d-95fe-438e-8e4f-c73bd3b92a04\") " 
pod="openshift-marketplace/certified-operators-2kg7f" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.618919 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frbnf\" (UniqueName: \"kubernetes.io/projected/e364767d-95fe-438e-8e4f-c73bd3b92a04-kube-api-access-frbnf\") pod \"certified-operators-2kg7f\" (UID: \"e364767d-95fe-438e-8e4f-c73bd3b92a04\") " pod="openshift-marketplace/certified-operators-2kg7f" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.620307 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4jgzx"] Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.665964 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2kg7f" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.703544 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glcwr\" (UniqueName: \"kubernetes.io/projected/9e349020-7268-4ca0-b16f-d4122ffd70ff-kube-api-access-glcwr\") pod \"redhat-operators-4jgzx\" (UID: \"9e349020-7268-4ca0-b16f-d4122ffd70ff\") " pod="openshift-marketplace/redhat-operators-4jgzx" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.703640 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e349020-7268-4ca0-b16f-d4122ffd70ff-catalog-content\") pod \"redhat-operators-4jgzx\" (UID: \"9e349020-7268-4ca0-b16f-d4122ffd70ff\") " pod="openshift-marketplace/redhat-operators-4jgzx" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.703677 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e349020-7268-4ca0-b16f-d4122ffd70ff-utilities\") pod \"redhat-operators-4jgzx\" (UID: \"9e349020-7268-4ca0-b16f-d4122ffd70ff\") " 
pod="openshift-marketplace/redhat-operators-4jgzx" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.810131 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glcwr\" (UniqueName: \"kubernetes.io/projected/9e349020-7268-4ca0-b16f-d4122ffd70ff-kube-api-access-glcwr\") pod \"redhat-operators-4jgzx\" (UID: \"9e349020-7268-4ca0-b16f-d4122ffd70ff\") " pod="openshift-marketplace/redhat-operators-4jgzx" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.810952 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e349020-7268-4ca0-b16f-d4122ffd70ff-catalog-content\") pod \"redhat-operators-4jgzx\" (UID: \"9e349020-7268-4ca0-b16f-d4122ffd70ff\") " pod="openshift-marketplace/redhat-operators-4jgzx" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.811059 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e349020-7268-4ca0-b16f-d4122ffd70ff-utilities\") pod \"redhat-operators-4jgzx\" (UID: \"9e349020-7268-4ca0-b16f-d4122ffd70ff\") " pod="openshift-marketplace/redhat-operators-4jgzx" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.811701 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e349020-7268-4ca0-b16f-d4122ffd70ff-utilities\") pod \"redhat-operators-4jgzx\" (UID: \"9e349020-7268-4ca0-b16f-d4122ffd70ff\") " pod="openshift-marketplace/redhat-operators-4jgzx" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.811907 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e349020-7268-4ca0-b16f-d4122ffd70ff-catalog-content\") pod \"redhat-operators-4jgzx\" (UID: \"9e349020-7268-4ca0-b16f-d4122ffd70ff\") " pod="openshift-marketplace/redhat-operators-4jgzx" Oct 12 08:27:32 crc 
kubenswrapper[4599]: I1012 08:27:32.836110 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glcwr\" (UniqueName: \"kubernetes.io/projected/9e349020-7268-4ca0-b16f-d4122ffd70ff-kube-api-access-glcwr\") pod \"redhat-operators-4jgzx\" (UID: \"9e349020-7268-4ca0-b16f-d4122ffd70ff\") " pod="openshift-marketplace/redhat-operators-4jgzx" Oct 12 08:27:32 crc kubenswrapper[4599]: I1012 08:27:32.867593 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4jgzx" Oct 12 08:27:33 crc kubenswrapper[4599]: I1012 08:27:33.136465 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2kg7f"] Oct 12 08:27:33 crc kubenswrapper[4599]: I1012 08:27:33.286770 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4jgzx"] Oct 12 08:27:33 crc kubenswrapper[4599]: W1012 08:27:33.343092 4599 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e349020_7268_4ca0_b16f_d4122ffd70ff.slice/crio-64c19d3c7d23dd19a7a1fbabbf2179fe5adc3ac93b74239eea2f277de2048be4 WatchSource:0}: Error finding container 64c19d3c7d23dd19a7a1fbabbf2179fe5adc3ac93b74239eea2f277de2048be4: Status 404 returned error can't find the container with id 64c19d3c7d23dd19a7a1fbabbf2179fe5adc3ac93b74239eea2f277de2048be4 Oct 12 08:27:33 crc kubenswrapper[4599]: I1012 08:27:33.591361 4599 generic.go:334] "Generic (PLEG): container finished" podID="e364767d-95fe-438e-8e4f-c73bd3b92a04" containerID="9da812629f2aadae9831a58f5e6a127125d62912ae85cc84848a535edd4efda6" exitCode=0 Oct 12 08:27:33 crc kubenswrapper[4599]: I1012 08:27:33.591403 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kg7f" 
event={"ID":"e364767d-95fe-438e-8e4f-c73bd3b92a04","Type":"ContainerDied","Data":"9da812629f2aadae9831a58f5e6a127125d62912ae85cc84848a535edd4efda6"} Oct 12 08:27:33 crc kubenswrapper[4599]: I1012 08:27:33.591456 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kg7f" event={"ID":"e364767d-95fe-438e-8e4f-c73bd3b92a04","Type":"ContainerStarted","Data":"6ad54d4fd4b887ed0682b2e5daf867944807dea2170e1741453c8097fb0258e2"} Oct 12 08:27:33 crc kubenswrapper[4599]: I1012 08:27:33.592909 4599 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 08:27:33 crc kubenswrapper[4599]: I1012 08:27:33.592968 4599 generic.go:334] "Generic (PLEG): container finished" podID="9e349020-7268-4ca0-b16f-d4122ffd70ff" containerID="5d4d7961b57d7bbd4f6e8dbb053db79d13786a49e237142f3d599959d033f18f" exitCode=0 Oct 12 08:27:33 crc kubenswrapper[4599]: I1012 08:27:33.592995 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jgzx" event={"ID":"9e349020-7268-4ca0-b16f-d4122ffd70ff","Type":"ContainerDied","Data":"5d4d7961b57d7bbd4f6e8dbb053db79d13786a49e237142f3d599959d033f18f"} Oct 12 08:27:33 crc kubenswrapper[4599]: I1012 08:27:33.593016 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jgzx" event={"ID":"9e349020-7268-4ca0-b16f-d4122ffd70ff","Type":"ContainerStarted","Data":"64c19d3c7d23dd19a7a1fbabbf2179fe5adc3ac93b74239eea2f277de2048be4"} Oct 12 08:27:34 crc kubenswrapper[4599]: I1012 08:27:34.600550 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kg7f" event={"ID":"e364767d-95fe-438e-8e4f-c73bd3b92a04","Type":"ContainerStarted","Data":"05e1c50c168cbdfcc96a0657bf9e6839691d8b2c590c0b01c7c8deac5a1c7b5d"} Oct 12 08:27:34 crc kubenswrapper[4599]: I1012 08:27:34.602976 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-4jgzx" event={"ID":"9e349020-7268-4ca0-b16f-d4122ffd70ff","Type":"ContainerStarted","Data":"948caf626ed0f1aca6fbc1286672721e56ea41fe2870e221d5dd059695c8ed3e"} Oct 12 08:27:34 crc kubenswrapper[4599]: I1012 08:27:34.745515 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-676j9"] Oct 12 08:27:34 crc kubenswrapper[4599]: I1012 08:27:34.747143 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-676j9" Oct 12 08:27:34 crc kubenswrapper[4599]: I1012 08:27:34.755695 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-676j9"] Oct 12 08:27:34 crc kubenswrapper[4599]: I1012 08:27:34.858206 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz67x\" (UniqueName: \"kubernetes.io/projected/40603c3c-1d14-4973-aac7-c28ba1aa08a3-kube-api-access-cz67x\") pod \"redhat-marketplace-676j9\" (UID: \"40603c3c-1d14-4973-aac7-c28ba1aa08a3\") " pod="openshift-marketplace/redhat-marketplace-676j9" Oct 12 08:27:34 crc kubenswrapper[4599]: I1012 08:27:34.858380 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40603c3c-1d14-4973-aac7-c28ba1aa08a3-catalog-content\") pod \"redhat-marketplace-676j9\" (UID: \"40603c3c-1d14-4973-aac7-c28ba1aa08a3\") " pod="openshift-marketplace/redhat-marketplace-676j9" Oct 12 08:27:34 crc kubenswrapper[4599]: I1012 08:27:34.858473 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40603c3c-1d14-4973-aac7-c28ba1aa08a3-utilities\") pod \"redhat-marketplace-676j9\" (UID: \"40603c3c-1d14-4973-aac7-c28ba1aa08a3\") " pod="openshift-marketplace/redhat-marketplace-676j9" Oct 12 08:27:34 crc 
kubenswrapper[4599]: I1012 08:27:34.959587 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz67x\" (UniqueName: \"kubernetes.io/projected/40603c3c-1d14-4973-aac7-c28ba1aa08a3-kube-api-access-cz67x\") pod \"redhat-marketplace-676j9\" (UID: \"40603c3c-1d14-4973-aac7-c28ba1aa08a3\") " pod="openshift-marketplace/redhat-marketplace-676j9" Oct 12 08:27:34 crc kubenswrapper[4599]: I1012 08:27:34.959697 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40603c3c-1d14-4973-aac7-c28ba1aa08a3-catalog-content\") pod \"redhat-marketplace-676j9\" (UID: \"40603c3c-1d14-4973-aac7-c28ba1aa08a3\") " pod="openshift-marketplace/redhat-marketplace-676j9" Oct 12 08:27:34 crc kubenswrapper[4599]: I1012 08:27:34.959757 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40603c3c-1d14-4973-aac7-c28ba1aa08a3-utilities\") pod \"redhat-marketplace-676j9\" (UID: \"40603c3c-1d14-4973-aac7-c28ba1aa08a3\") " pod="openshift-marketplace/redhat-marketplace-676j9" Oct 12 08:27:34 crc kubenswrapper[4599]: I1012 08:27:34.960101 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40603c3c-1d14-4973-aac7-c28ba1aa08a3-catalog-content\") pod \"redhat-marketplace-676j9\" (UID: \"40603c3c-1d14-4973-aac7-c28ba1aa08a3\") " pod="openshift-marketplace/redhat-marketplace-676j9" Oct 12 08:27:34 crc kubenswrapper[4599]: I1012 08:27:34.960140 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40603c3c-1d14-4973-aac7-c28ba1aa08a3-utilities\") pod \"redhat-marketplace-676j9\" (UID: \"40603c3c-1d14-4973-aac7-c28ba1aa08a3\") " pod="openshift-marketplace/redhat-marketplace-676j9" Oct 12 08:27:34 crc kubenswrapper[4599]: I1012 08:27:34.975979 4599 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz67x\" (UniqueName: \"kubernetes.io/projected/40603c3c-1d14-4973-aac7-c28ba1aa08a3-kube-api-access-cz67x\") pod \"redhat-marketplace-676j9\" (UID: \"40603c3c-1d14-4973-aac7-c28ba1aa08a3\") " pod="openshift-marketplace/redhat-marketplace-676j9" Oct 12 08:27:35 crc kubenswrapper[4599]: I1012 08:27:35.059003 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-676j9" Oct 12 08:27:35 crc kubenswrapper[4599]: I1012 08:27:35.459031 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-676j9"] Oct 12 08:27:35 crc kubenswrapper[4599]: I1012 08:27:35.610510 4599 generic.go:334] "Generic (PLEG): container finished" podID="9e349020-7268-4ca0-b16f-d4122ffd70ff" containerID="948caf626ed0f1aca6fbc1286672721e56ea41fe2870e221d5dd059695c8ed3e" exitCode=0 Oct 12 08:27:35 crc kubenswrapper[4599]: I1012 08:27:35.610585 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jgzx" event={"ID":"9e349020-7268-4ca0-b16f-d4122ffd70ff","Type":"ContainerDied","Data":"948caf626ed0f1aca6fbc1286672721e56ea41fe2870e221d5dd059695c8ed3e"} Oct 12 08:27:35 crc kubenswrapper[4599]: I1012 08:27:35.614396 4599 generic.go:334] "Generic (PLEG): container finished" podID="e364767d-95fe-438e-8e4f-c73bd3b92a04" containerID="05e1c50c168cbdfcc96a0657bf9e6839691d8b2c590c0b01c7c8deac5a1c7b5d" exitCode=0 Oct 12 08:27:35 crc kubenswrapper[4599]: I1012 08:27:35.614480 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kg7f" event={"ID":"e364767d-95fe-438e-8e4f-c73bd3b92a04","Type":"ContainerDied","Data":"05e1c50c168cbdfcc96a0657bf9e6839691d8b2c590c0b01c7c8deac5a1c7b5d"} Oct 12 08:27:35 crc kubenswrapper[4599]: I1012 08:27:35.624242 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-676j9" event={"ID":"40603c3c-1d14-4973-aac7-c28ba1aa08a3","Type":"ContainerStarted","Data":"ca3e2a1d1a8e24ebe157c4037f024ab7de9b278eb10b7da069a579959ac66bf5"} Oct 12 08:27:35 crc kubenswrapper[4599]: I1012 08:27:35.624287 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-676j9" event={"ID":"40603c3c-1d14-4973-aac7-c28ba1aa08a3","Type":"ContainerStarted","Data":"ccb0b568a59c053b3fbe90a91ebea8056f42ff9224c79424d01993aca80c31ce"} Oct 12 08:27:36 crc kubenswrapper[4599]: I1012 08:27:36.632959 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kg7f" event={"ID":"e364767d-95fe-438e-8e4f-c73bd3b92a04","Type":"ContainerStarted","Data":"de1d5e4b7702251cd14c5b935809ed8dddce6ae576ea2396e8894f881309951a"} Oct 12 08:27:36 crc kubenswrapper[4599]: I1012 08:27:36.637538 4599 generic.go:334] "Generic (PLEG): container finished" podID="40603c3c-1d14-4973-aac7-c28ba1aa08a3" containerID="ca3e2a1d1a8e24ebe157c4037f024ab7de9b278eb10b7da069a579959ac66bf5" exitCode=0 Oct 12 08:27:36 crc kubenswrapper[4599]: I1012 08:27:36.637603 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-676j9" event={"ID":"40603c3c-1d14-4973-aac7-c28ba1aa08a3","Type":"ContainerDied","Data":"ca3e2a1d1a8e24ebe157c4037f024ab7de9b278eb10b7da069a579959ac66bf5"} Oct 12 08:27:36 crc kubenswrapper[4599]: I1012 08:27:36.644539 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jgzx" event={"ID":"9e349020-7268-4ca0-b16f-d4122ffd70ff","Type":"ContainerStarted","Data":"7aa1ba52d4d8c32dd70950e7c2ecd5649e854e26be1e67a63c2190ce6d5f8c95"} Oct 12 08:27:36 crc kubenswrapper[4599]: I1012 08:27:36.653715 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2kg7f" podStartSLOduration=2.113496272 
podStartE2EDuration="4.653698135s" podCreationTimestamp="2025-10-12 08:27:32 +0000 UTC" firstStartedPulling="2025-10-12 08:27:33.592674021 +0000 UTC m=+3150.381869524" lastFinishedPulling="2025-10-12 08:27:36.132875885 +0000 UTC m=+3152.922071387" observedRunningTime="2025-10-12 08:27:36.649192908 +0000 UTC m=+3153.438388411" watchObservedRunningTime="2025-10-12 08:27:36.653698135 +0000 UTC m=+3153.442893636" Oct 12 08:27:36 crc kubenswrapper[4599]: I1012 08:27:36.665950 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4jgzx" podStartSLOduration=2.004917352 podStartE2EDuration="4.665931584s" podCreationTimestamp="2025-10-12 08:27:32 +0000 UTC" firstStartedPulling="2025-10-12 08:27:33.594841479 +0000 UTC m=+3150.384036971" lastFinishedPulling="2025-10-12 08:27:36.255855701 +0000 UTC m=+3153.045051203" observedRunningTime="2025-10-12 08:27:36.662046327 +0000 UTC m=+3153.451241829" watchObservedRunningTime="2025-10-12 08:27:36.665931584 +0000 UTC m=+3153.455127086" Oct 12 08:27:37 crc kubenswrapper[4599]: I1012 08:27:37.652452 4599 generic.go:334] "Generic (PLEG): container finished" podID="40603c3c-1d14-4973-aac7-c28ba1aa08a3" containerID="bb0d7360eebde2f94572fe04956e95c924c764a40d8b6edfbbc727ceeb321ebf" exitCode=0 Oct 12 08:27:37 crc kubenswrapper[4599]: I1012 08:27:37.652499 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-676j9" event={"ID":"40603c3c-1d14-4973-aac7-c28ba1aa08a3","Type":"ContainerDied","Data":"bb0d7360eebde2f94572fe04956e95c924c764a40d8b6edfbbc727ceeb321ebf"} Oct 12 08:27:38 crc kubenswrapper[4599]: I1012 08:27:38.545266 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:27:38 crc kubenswrapper[4599]: I1012 08:27:38.659779 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-676j9" 
event={"ID":"40603c3c-1d14-4973-aac7-c28ba1aa08a3","Type":"ContainerStarted","Data":"29d1e8040444d91cb4ec6ca5019d61e6571f099311c941eadb7cf3f21dcb9c79"} Oct 12 08:27:38 crc kubenswrapper[4599]: I1012 08:27:38.686917 4599 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-676j9" podStartSLOduration=3.083550299 podStartE2EDuration="4.686901516s" podCreationTimestamp="2025-10-12 08:27:34 +0000 UTC" firstStartedPulling="2025-10-12 08:27:36.640131562 +0000 UTC m=+3153.429327064" lastFinishedPulling="2025-10-12 08:27:38.243482779 +0000 UTC m=+3155.032678281" observedRunningTime="2025-10-12 08:27:38.679560142 +0000 UTC m=+3155.468755645" watchObservedRunningTime="2025-10-12 08:27:38.686901516 +0000 UTC m=+3155.476097018" Oct 12 08:27:39 crc kubenswrapper[4599]: I1012 08:27:39.670824 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerStarted","Data":"261ce39d1df16a70aab27383f113d909d054165675c4dc256efe676be8b231e4"} Oct 12 08:27:42 crc kubenswrapper[4599]: I1012 08:27:42.667024 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2kg7f" Oct 12 08:27:42 crc kubenswrapper[4599]: I1012 08:27:42.667592 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2kg7f" Oct 12 08:27:42 crc kubenswrapper[4599]: I1012 08:27:42.709645 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2kg7f" Oct 12 08:27:42 crc kubenswrapper[4599]: I1012 08:27:42.746262 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2kg7f" Oct 12 08:27:42 crc kubenswrapper[4599]: I1012 08:27:42.868630 4599 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4jgzx" Oct 12 08:27:42 crc kubenswrapper[4599]: I1012 08:27:42.868680 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4jgzx" Oct 12 08:27:42 crc kubenswrapper[4599]: I1012 08:27:42.913081 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4jgzx" Oct 12 08:27:43 crc kubenswrapper[4599]: I1012 08:27:43.733776 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4jgzx" Oct 12 08:27:43 crc kubenswrapper[4599]: I1012 08:27:43.742209 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2kg7f"] Oct 12 08:27:44 crc kubenswrapper[4599]: I1012 08:27:44.391368 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-44z27_b22d491a-9add-4ec5-ad3e-e593f9ca93bd/control-plane-machine-set-operator/0.log" Oct 12 08:27:44 crc kubenswrapper[4599]: I1012 08:27:44.531559 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tgck5_1e511cbc-ca72-4380-8626-d9cade8ce3e2/kube-rbac-proxy/0.log" Oct 12 08:27:44 crc kubenswrapper[4599]: I1012 08:27:44.540867 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tgck5_1e511cbc-ca72-4380-8626-d9cade8ce3e2/machine-api-operator/0.log" Oct 12 08:27:44 crc kubenswrapper[4599]: I1012 08:27:44.702049 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2kg7f" podUID="e364767d-95fe-438e-8e4f-c73bd3b92a04" containerName="registry-server" containerID="cri-o://de1d5e4b7702251cd14c5b935809ed8dddce6ae576ea2396e8894f881309951a" gracePeriod=2 Oct 12 08:27:45 crc 
kubenswrapper[4599]: I1012 08:27:45.059993 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-676j9" Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.060391 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-676j9" Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.066557 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2kg7f" Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.112301 4599 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-676j9" Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.141392 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4jgzx"] Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.253044 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frbnf\" (UniqueName: \"kubernetes.io/projected/e364767d-95fe-438e-8e4f-c73bd3b92a04-kube-api-access-frbnf\") pod \"e364767d-95fe-438e-8e4f-c73bd3b92a04\" (UID: \"e364767d-95fe-438e-8e4f-c73bd3b92a04\") " Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.253162 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e364767d-95fe-438e-8e4f-c73bd3b92a04-utilities\") pod \"e364767d-95fe-438e-8e4f-c73bd3b92a04\" (UID: \"e364767d-95fe-438e-8e4f-c73bd3b92a04\") " Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.253209 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e364767d-95fe-438e-8e4f-c73bd3b92a04-catalog-content\") pod \"e364767d-95fe-438e-8e4f-c73bd3b92a04\" (UID: \"e364767d-95fe-438e-8e4f-c73bd3b92a04\") " Oct 12 08:27:45 
crc kubenswrapper[4599]: I1012 08:27:45.255784 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e364767d-95fe-438e-8e4f-c73bd3b92a04-utilities" (OuterVolumeSpecName: "utilities") pod "e364767d-95fe-438e-8e4f-c73bd3b92a04" (UID: "e364767d-95fe-438e-8e4f-c73bd3b92a04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.273201 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e364767d-95fe-438e-8e4f-c73bd3b92a04-kube-api-access-frbnf" (OuterVolumeSpecName: "kube-api-access-frbnf") pod "e364767d-95fe-438e-8e4f-c73bd3b92a04" (UID: "e364767d-95fe-438e-8e4f-c73bd3b92a04"). InnerVolumeSpecName "kube-api-access-frbnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.293042 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e364767d-95fe-438e-8e4f-c73bd3b92a04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e364767d-95fe-438e-8e4f-c73bd3b92a04" (UID: "e364767d-95fe-438e-8e4f-c73bd3b92a04"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.357397 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frbnf\" (UniqueName: \"kubernetes.io/projected/e364767d-95fe-438e-8e4f-c73bd3b92a04-kube-api-access-frbnf\") on node \"crc\" DevicePath \"\"" Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.357461 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e364767d-95fe-438e-8e4f-c73bd3b92a04-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.357479 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e364767d-95fe-438e-8e4f-c73bd3b92a04-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.711301 4599 generic.go:334] "Generic (PLEG): container finished" podID="e364767d-95fe-438e-8e4f-c73bd3b92a04" containerID="de1d5e4b7702251cd14c5b935809ed8dddce6ae576ea2396e8894f881309951a" exitCode=0 Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.711372 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kg7f" event={"ID":"e364767d-95fe-438e-8e4f-c73bd3b92a04","Type":"ContainerDied","Data":"de1d5e4b7702251cd14c5b935809ed8dddce6ae576ea2396e8894f881309951a"} Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.711420 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kg7f" event={"ID":"e364767d-95fe-438e-8e4f-c73bd3b92a04","Type":"ContainerDied","Data":"6ad54d4fd4b887ed0682b2e5daf867944807dea2170e1741453c8097fb0258e2"} Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.711381 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2kg7f" Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.711458 4599 scope.go:117] "RemoveContainer" containerID="de1d5e4b7702251cd14c5b935809ed8dddce6ae576ea2396e8894f881309951a" Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.711559 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4jgzx" podUID="9e349020-7268-4ca0-b16f-d4122ffd70ff" containerName="registry-server" containerID="cri-o://7aa1ba52d4d8c32dd70950e7c2ecd5649e854e26be1e67a63c2190ce6d5f8c95" gracePeriod=2 Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.735594 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2kg7f"] Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.742147 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2kg7f"] Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.742445 4599 scope.go:117] "RemoveContainer" containerID="05e1c50c168cbdfcc96a0657bf9e6839691d8b2c590c0b01c7c8deac5a1c7b5d" Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.762393 4599 scope.go:117] "RemoveContainer" containerID="9da812629f2aadae9831a58f5e6a127125d62912ae85cc84848a535edd4efda6" Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.772670 4599 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-676j9" Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.883273 4599 scope.go:117] "RemoveContainer" containerID="de1d5e4b7702251cd14c5b935809ed8dddce6ae576ea2396e8894f881309951a" Oct 12 08:27:45 crc kubenswrapper[4599]: E1012 08:27:45.883816 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de1d5e4b7702251cd14c5b935809ed8dddce6ae576ea2396e8894f881309951a\": container with ID starting with 
de1d5e4b7702251cd14c5b935809ed8dddce6ae576ea2396e8894f881309951a not found: ID does not exist" containerID="de1d5e4b7702251cd14c5b935809ed8dddce6ae576ea2396e8894f881309951a" Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.883850 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de1d5e4b7702251cd14c5b935809ed8dddce6ae576ea2396e8894f881309951a"} err="failed to get container status \"de1d5e4b7702251cd14c5b935809ed8dddce6ae576ea2396e8894f881309951a\": rpc error: code = NotFound desc = could not find container \"de1d5e4b7702251cd14c5b935809ed8dddce6ae576ea2396e8894f881309951a\": container with ID starting with de1d5e4b7702251cd14c5b935809ed8dddce6ae576ea2396e8894f881309951a not found: ID does not exist" Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.883869 4599 scope.go:117] "RemoveContainer" containerID="05e1c50c168cbdfcc96a0657bf9e6839691d8b2c590c0b01c7c8deac5a1c7b5d" Oct 12 08:27:45 crc kubenswrapper[4599]: E1012 08:27:45.884165 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05e1c50c168cbdfcc96a0657bf9e6839691d8b2c590c0b01c7c8deac5a1c7b5d\": container with ID starting with 05e1c50c168cbdfcc96a0657bf9e6839691d8b2c590c0b01c7c8deac5a1c7b5d not found: ID does not exist" containerID="05e1c50c168cbdfcc96a0657bf9e6839691d8b2c590c0b01c7c8deac5a1c7b5d" Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.884210 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05e1c50c168cbdfcc96a0657bf9e6839691d8b2c590c0b01c7c8deac5a1c7b5d"} err="failed to get container status \"05e1c50c168cbdfcc96a0657bf9e6839691d8b2c590c0b01c7c8deac5a1c7b5d\": rpc error: code = NotFound desc = could not find container \"05e1c50c168cbdfcc96a0657bf9e6839691d8b2c590c0b01c7c8deac5a1c7b5d\": container with ID starting with 05e1c50c168cbdfcc96a0657bf9e6839691d8b2c590c0b01c7c8deac5a1c7b5d not found: ID does not 
exist" Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.884236 4599 scope.go:117] "RemoveContainer" containerID="9da812629f2aadae9831a58f5e6a127125d62912ae85cc84848a535edd4efda6" Oct 12 08:27:45 crc kubenswrapper[4599]: E1012 08:27:45.884580 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9da812629f2aadae9831a58f5e6a127125d62912ae85cc84848a535edd4efda6\": container with ID starting with 9da812629f2aadae9831a58f5e6a127125d62912ae85cc84848a535edd4efda6 not found: ID does not exist" containerID="9da812629f2aadae9831a58f5e6a127125d62912ae85cc84848a535edd4efda6" Oct 12 08:27:45 crc kubenswrapper[4599]: I1012 08:27:45.884603 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9da812629f2aadae9831a58f5e6a127125d62912ae85cc84848a535edd4efda6"} err="failed to get container status \"9da812629f2aadae9831a58f5e6a127125d62912ae85cc84848a535edd4efda6\": rpc error: code = NotFound desc = could not find container \"9da812629f2aadae9831a58f5e6a127125d62912ae85cc84848a535edd4efda6\": container with ID starting with 9da812629f2aadae9831a58f5e6a127125d62912ae85cc84848a535edd4efda6 not found: ID does not exist" Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.059562 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4jgzx" Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.175458 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e349020-7268-4ca0-b16f-d4122ffd70ff-utilities\") pod \"9e349020-7268-4ca0-b16f-d4122ffd70ff\" (UID: \"9e349020-7268-4ca0-b16f-d4122ffd70ff\") " Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.175772 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glcwr\" (UniqueName: \"kubernetes.io/projected/9e349020-7268-4ca0-b16f-d4122ffd70ff-kube-api-access-glcwr\") pod \"9e349020-7268-4ca0-b16f-d4122ffd70ff\" (UID: \"9e349020-7268-4ca0-b16f-d4122ffd70ff\") " Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.175815 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e349020-7268-4ca0-b16f-d4122ffd70ff-catalog-content\") pod \"9e349020-7268-4ca0-b16f-d4122ffd70ff\" (UID: \"9e349020-7268-4ca0-b16f-d4122ffd70ff\") " Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.176673 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e349020-7268-4ca0-b16f-d4122ffd70ff-utilities" (OuterVolumeSpecName: "utilities") pod "9e349020-7268-4ca0-b16f-d4122ffd70ff" (UID: "9e349020-7268-4ca0-b16f-d4122ffd70ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.182118 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e349020-7268-4ca0-b16f-d4122ffd70ff-kube-api-access-glcwr" (OuterVolumeSpecName: "kube-api-access-glcwr") pod "9e349020-7268-4ca0-b16f-d4122ffd70ff" (UID: "9e349020-7268-4ca0-b16f-d4122ffd70ff"). InnerVolumeSpecName "kube-api-access-glcwr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.243595 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e349020-7268-4ca0-b16f-d4122ffd70ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e349020-7268-4ca0-b16f-d4122ffd70ff" (UID: "9e349020-7268-4ca0-b16f-d4122ffd70ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.279148 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e349020-7268-4ca0-b16f-d4122ffd70ff-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.279180 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e349020-7268-4ca0-b16f-d4122ffd70ff-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.279191 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glcwr\" (UniqueName: \"kubernetes.io/projected/9e349020-7268-4ca0-b16f-d4122ffd70ff-kube-api-access-glcwr\") on node \"crc\" DevicePath \"\"" Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.719773 4599 generic.go:334] "Generic (PLEG): container finished" podID="9e349020-7268-4ca0-b16f-d4122ffd70ff" containerID="7aa1ba52d4d8c32dd70950e7c2ecd5649e854e26be1e67a63c2190ce6d5f8c95" exitCode=0 Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.719858 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4jgzx" Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.719875 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jgzx" event={"ID":"9e349020-7268-4ca0-b16f-d4122ffd70ff","Type":"ContainerDied","Data":"7aa1ba52d4d8c32dd70950e7c2ecd5649e854e26be1e67a63c2190ce6d5f8c95"} Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.720196 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jgzx" event={"ID":"9e349020-7268-4ca0-b16f-d4122ffd70ff","Type":"ContainerDied","Data":"64c19d3c7d23dd19a7a1fbabbf2179fe5adc3ac93b74239eea2f277de2048be4"} Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.720223 4599 scope.go:117] "RemoveContainer" containerID="7aa1ba52d4d8c32dd70950e7c2ecd5649e854e26be1e67a63c2190ce6d5f8c95" Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.750653 4599 scope.go:117] "RemoveContainer" containerID="948caf626ed0f1aca6fbc1286672721e56ea41fe2870e221d5dd059695c8ed3e" Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.750753 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4jgzx"] Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.757897 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4jgzx"] Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.780755 4599 scope.go:117] "RemoveContainer" containerID="5d4d7961b57d7bbd4f6e8dbb053db79d13786a49e237142f3d599959d033f18f" Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.801160 4599 scope.go:117] "RemoveContainer" containerID="7aa1ba52d4d8c32dd70950e7c2ecd5649e854e26be1e67a63c2190ce6d5f8c95" Oct 12 08:27:46 crc kubenswrapper[4599]: E1012 08:27:46.801521 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7aa1ba52d4d8c32dd70950e7c2ecd5649e854e26be1e67a63c2190ce6d5f8c95\": container with ID starting with 7aa1ba52d4d8c32dd70950e7c2ecd5649e854e26be1e67a63c2190ce6d5f8c95 not found: ID does not exist" containerID="7aa1ba52d4d8c32dd70950e7c2ecd5649e854e26be1e67a63c2190ce6d5f8c95" Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.801556 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa1ba52d4d8c32dd70950e7c2ecd5649e854e26be1e67a63c2190ce6d5f8c95"} err="failed to get container status \"7aa1ba52d4d8c32dd70950e7c2ecd5649e854e26be1e67a63c2190ce6d5f8c95\": rpc error: code = NotFound desc = could not find container \"7aa1ba52d4d8c32dd70950e7c2ecd5649e854e26be1e67a63c2190ce6d5f8c95\": container with ID starting with 7aa1ba52d4d8c32dd70950e7c2ecd5649e854e26be1e67a63c2190ce6d5f8c95 not found: ID does not exist" Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.801579 4599 scope.go:117] "RemoveContainer" containerID="948caf626ed0f1aca6fbc1286672721e56ea41fe2870e221d5dd059695c8ed3e" Oct 12 08:27:46 crc kubenswrapper[4599]: E1012 08:27:46.801800 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"948caf626ed0f1aca6fbc1286672721e56ea41fe2870e221d5dd059695c8ed3e\": container with ID starting with 948caf626ed0f1aca6fbc1286672721e56ea41fe2870e221d5dd059695c8ed3e not found: ID does not exist" containerID="948caf626ed0f1aca6fbc1286672721e56ea41fe2870e221d5dd059695c8ed3e" Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.801831 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"948caf626ed0f1aca6fbc1286672721e56ea41fe2870e221d5dd059695c8ed3e"} err="failed to get container status \"948caf626ed0f1aca6fbc1286672721e56ea41fe2870e221d5dd059695c8ed3e\": rpc error: code = NotFound desc = could not find container \"948caf626ed0f1aca6fbc1286672721e56ea41fe2870e221d5dd059695c8ed3e\": container with ID 
starting with 948caf626ed0f1aca6fbc1286672721e56ea41fe2870e221d5dd059695c8ed3e not found: ID does not exist" Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.801855 4599 scope.go:117] "RemoveContainer" containerID="5d4d7961b57d7bbd4f6e8dbb053db79d13786a49e237142f3d599959d033f18f" Oct 12 08:27:46 crc kubenswrapper[4599]: E1012 08:27:46.802055 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d4d7961b57d7bbd4f6e8dbb053db79d13786a49e237142f3d599959d033f18f\": container with ID starting with 5d4d7961b57d7bbd4f6e8dbb053db79d13786a49e237142f3d599959d033f18f not found: ID does not exist" containerID="5d4d7961b57d7bbd4f6e8dbb053db79d13786a49e237142f3d599959d033f18f" Oct 12 08:27:46 crc kubenswrapper[4599]: I1012 08:27:46.802074 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d4d7961b57d7bbd4f6e8dbb053db79d13786a49e237142f3d599959d033f18f"} err="failed to get container status \"5d4d7961b57d7bbd4f6e8dbb053db79d13786a49e237142f3d599959d033f18f\": rpc error: code = NotFound desc = could not find container \"5d4d7961b57d7bbd4f6e8dbb053db79d13786a49e237142f3d599959d033f18f\": container with ID starting with 5d4d7961b57d7bbd4f6e8dbb053db79d13786a49e237142f3d599959d033f18f not found: ID does not exist" Oct 12 08:27:47 crc kubenswrapper[4599]: I1012 08:27:47.540329 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-676j9"] Oct 12 08:27:47 crc kubenswrapper[4599]: I1012 08:27:47.552446 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e349020-7268-4ca0-b16f-d4122ffd70ff" path="/var/lib/kubelet/pods/9e349020-7268-4ca0-b16f-d4122ffd70ff/volumes" Oct 12 08:27:47 crc kubenswrapper[4599]: I1012 08:27:47.553025 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e364767d-95fe-438e-8e4f-c73bd3b92a04" 
path="/var/lib/kubelet/pods/e364767d-95fe-438e-8e4f-c73bd3b92a04/volumes" Oct 12 08:27:47 crc kubenswrapper[4599]: I1012 08:27:47.727765 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-676j9" podUID="40603c3c-1d14-4973-aac7-c28ba1aa08a3" containerName="registry-server" containerID="cri-o://29d1e8040444d91cb4ec6ca5019d61e6571f099311c941eadb7cf3f21dcb9c79" gracePeriod=2 Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.121961 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-676j9" Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.321763 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40603c3c-1d14-4973-aac7-c28ba1aa08a3-catalog-content\") pod \"40603c3c-1d14-4973-aac7-c28ba1aa08a3\" (UID: \"40603c3c-1d14-4973-aac7-c28ba1aa08a3\") " Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.321881 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40603c3c-1d14-4973-aac7-c28ba1aa08a3-utilities\") pod \"40603c3c-1d14-4973-aac7-c28ba1aa08a3\" (UID: \"40603c3c-1d14-4973-aac7-c28ba1aa08a3\") " Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.322025 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz67x\" (UniqueName: \"kubernetes.io/projected/40603c3c-1d14-4973-aac7-c28ba1aa08a3-kube-api-access-cz67x\") pod \"40603c3c-1d14-4973-aac7-c28ba1aa08a3\" (UID: \"40603c3c-1d14-4973-aac7-c28ba1aa08a3\") " Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.322757 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40603c3c-1d14-4973-aac7-c28ba1aa08a3-utilities" (OuterVolumeSpecName: "utilities") pod 
"40603c3c-1d14-4973-aac7-c28ba1aa08a3" (UID: "40603c3c-1d14-4973-aac7-c28ba1aa08a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.323205 4599 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40603c3c-1d14-4973-aac7-c28ba1aa08a3-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.327871 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40603c3c-1d14-4973-aac7-c28ba1aa08a3-kube-api-access-cz67x" (OuterVolumeSpecName: "kube-api-access-cz67x") pod "40603c3c-1d14-4973-aac7-c28ba1aa08a3" (UID: "40603c3c-1d14-4973-aac7-c28ba1aa08a3"). InnerVolumeSpecName "kube-api-access-cz67x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.334997 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40603c3c-1d14-4973-aac7-c28ba1aa08a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40603c3c-1d14-4973-aac7-c28ba1aa08a3" (UID: "40603c3c-1d14-4973-aac7-c28ba1aa08a3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.426878 4599 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40603c3c-1d14-4973-aac7-c28ba1aa08a3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.426926 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz67x\" (UniqueName: \"kubernetes.io/projected/40603c3c-1d14-4973-aac7-c28ba1aa08a3-kube-api-access-cz67x\") on node \"crc\" DevicePath \"\"" Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.736097 4599 generic.go:334] "Generic (PLEG): container finished" podID="40603c3c-1d14-4973-aac7-c28ba1aa08a3" containerID="29d1e8040444d91cb4ec6ca5019d61e6571f099311c941eadb7cf3f21dcb9c79" exitCode=0 Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.736150 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-676j9" Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.736183 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-676j9" event={"ID":"40603c3c-1d14-4973-aac7-c28ba1aa08a3","Type":"ContainerDied","Data":"29d1e8040444d91cb4ec6ca5019d61e6571f099311c941eadb7cf3f21dcb9c79"} Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.737257 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-676j9" event={"ID":"40603c3c-1d14-4973-aac7-c28ba1aa08a3","Type":"ContainerDied","Data":"ccb0b568a59c053b3fbe90a91ebea8056f42ff9224c79424d01993aca80c31ce"} Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.737283 4599 scope.go:117] "RemoveContainer" containerID="29d1e8040444d91cb4ec6ca5019d61e6571f099311c941eadb7cf3f21dcb9c79" Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.753864 4599 scope.go:117] "RemoveContainer" 
containerID="bb0d7360eebde2f94572fe04956e95c924c764a40d8b6edfbbc727ceeb321ebf" Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.763179 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-676j9"] Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.770162 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-676j9"] Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.779520 4599 scope.go:117] "RemoveContainer" containerID="ca3e2a1d1a8e24ebe157c4037f024ab7de9b278eb10b7da069a579959ac66bf5" Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.804045 4599 scope.go:117] "RemoveContainer" containerID="29d1e8040444d91cb4ec6ca5019d61e6571f099311c941eadb7cf3f21dcb9c79" Oct 12 08:27:48 crc kubenswrapper[4599]: E1012 08:27:48.804357 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29d1e8040444d91cb4ec6ca5019d61e6571f099311c941eadb7cf3f21dcb9c79\": container with ID starting with 29d1e8040444d91cb4ec6ca5019d61e6571f099311c941eadb7cf3f21dcb9c79 not found: ID does not exist" containerID="29d1e8040444d91cb4ec6ca5019d61e6571f099311c941eadb7cf3f21dcb9c79" Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.804395 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29d1e8040444d91cb4ec6ca5019d61e6571f099311c941eadb7cf3f21dcb9c79"} err="failed to get container status \"29d1e8040444d91cb4ec6ca5019d61e6571f099311c941eadb7cf3f21dcb9c79\": rpc error: code = NotFound desc = could not find container \"29d1e8040444d91cb4ec6ca5019d61e6571f099311c941eadb7cf3f21dcb9c79\": container with ID starting with 29d1e8040444d91cb4ec6ca5019d61e6571f099311c941eadb7cf3f21dcb9c79 not found: ID does not exist" Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.804418 4599 scope.go:117] "RemoveContainer" 
containerID="bb0d7360eebde2f94572fe04956e95c924c764a40d8b6edfbbc727ceeb321ebf" Oct 12 08:27:48 crc kubenswrapper[4599]: E1012 08:27:48.804722 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb0d7360eebde2f94572fe04956e95c924c764a40d8b6edfbbc727ceeb321ebf\": container with ID starting with bb0d7360eebde2f94572fe04956e95c924c764a40d8b6edfbbc727ceeb321ebf not found: ID does not exist" containerID="bb0d7360eebde2f94572fe04956e95c924c764a40d8b6edfbbc727ceeb321ebf" Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.804756 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb0d7360eebde2f94572fe04956e95c924c764a40d8b6edfbbc727ceeb321ebf"} err="failed to get container status \"bb0d7360eebde2f94572fe04956e95c924c764a40d8b6edfbbc727ceeb321ebf\": rpc error: code = NotFound desc = could not find container \"bb0d7360eebde2f94572fe04956e95c924c764a40d8b6edfbbc727ceeb321ebf\": container with ID starting with bb0d7360eebde2f94572fe04956e95c924c764a40d8b6edfbbc727ceeb321ebf not found: ID does not exist" Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.804775 4599 scope.go:117] "RemoveContainer" containerID="ca3e2a1d1a8e24ebe157c4037f024ab7de9b278eb10b7da069a579959ac66bf5" Oct 12 08:27:48 crc kubenswrapper[4599]: E1012 08:27:48.805025 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca3e2a1d1a8e24ebe157c4037f024ab7de9b278eb10b7da069a579959ac66bf5\": container with ID starting with ca3e2a1d1a8e24ebe157c4037f024ab7de9b278eb10b7da069a579959ac66bf5 not found: ID does not exist" containerID="ca3e2a1d1a8e24ebe157c4037f024ab7de9b278eb10b7da069a579959ac66bf5" Oct 12 08:27:48 crc kubenswrapper[4599]: I1012 08:27:48.805060 4599 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ca3e2a1d1a8e24ebe157c4037f024ab7de9b278eb10b7da069a579959ac66bf5"} err="failed to get container status \"ca3e2a1d1a8e24ebe157c4037f024ab7de9b278eb10b7da069a579959ac66bf5\": rpc error: code = NotFound desc = could not find container \"ca3e2a1d1a8e24ebe157c4037f024ab7de9b278eb10b7da069a579959ac66bf5\": container with ID starting with ca3e2a1d1a8e24ebe157c4037f024ab7de9b278eb10b7da069a579959ac66bf5 not found: ID does not exist" Oct 12 08:27:49 crc kubenswrapper[4599]: I1012 08:27:49.554057 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40603c3c-1d14-4973-aac7-c28ba1aa08a3" path="/var/lib/kubelet/pods/40603c3c-1d14-4973-aac7-c28ba1aa08a3/volumes" Oct 12 08:27:53 crc kubenswrapper[4599]: I1012 08:27:53.905496 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-7v5dn_70c0dff5-0cd7-4399-a044-c95469bea793/cert-manager-controller/0.log" Oct 12 08:27:54 crc kubenswrapper[4599]: I1012 08:27:54.017789 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-kpxnw_88150699-5d1d-4b47-ad1e-bbe4cf006a3e/cert-manager-cainjector/0.log" Oct 12 08:27:54 crc kubenswrapper[4599]: I1012 08:27:54.031084 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-tp9pm_c6d2a135-7c1d-4cfb-b8ee-fa9737f62776/cert-manager-webhook/0.log" Oct 12 08:28:02 crc kubenswrapper[4599]: I1012 08:28:02.395853 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-klx8j_2ff98253-bb25-450e-9202-817788dab660/nmstate-console-plugin/0.log" Oct 12 08:28:02 crc kubenswrapper[4599]: I1012 08:28:02.565434 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-mxzhr_cf76296b-ab43-4e40-83c9-ee507169ea4c/nmstate-handler/0.log" Oct 12 08:28:02 crc kubenswrapper[4599]: I1012 08:28:02.579443 4599 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-tls82_126c69f6-12a5-46a8-a817-23b97dc624d7/nmstate-metrics/0.log" Oct 12 08:28:02 crc kubenswrapper[4599]: I1012 08:28:02.585442 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-tls82_126c69f6-12a5-46a8-a817-23b97dc624d7/kube-rbac-proxy/0.log" Oct 12 08:28:02 crc kubenswrapper[4599]: I1012 08:28:02.692832 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-vz57h_87a55b6b-2189-4332-beb0-5bf12c1ded00/nmstate-operator/0.log" Oct 12 08:28:02 crc kubenswrapper[4599]: I1012 08:28:02.719095 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-rb46k_fe7e44b3-5972-4d03-8919-8a67214fee06/nmstate-webhook/0.log" Oct 12 08:28:12 crc kubenswrapper[4599]: I1012 08:28:12.691417 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-v2d4g_a8d05e19-feeb-41e3-ab30-f55af42472ca/kube-rbac-proxy/0.log" Oct 12 08:28:12 crc kubenswrapper[4599]: I1012 08:28:12.729953 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-v2d4g_a8d05e19-feeb-41e3-ab30-f55af42472ca/controller/0.log" Oct 12 08:28:12 crc kubenswrapper[4599]: I1012 08:28:12.866671 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-frr-files/0.log" Oct 12 08:28:13 crc kubenswrapper[4599]: I1012 08:28:13.012475 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-frr-files/0.log" Oct 12 08:28:13 crc kubenswrapper[4599]: I1012 08:28:13.035632 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-reloader/0.log" Oct 12 08:28:13 crc 
kubenswrapper[4599]: I1012 08:28:13.048251 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-reloader/0.log" Oct 12 08:28:13 crc kubenswrapper[4599]: I1012 08:28:13.049600 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-metrics/0.log" Oct 12 08:28:13 crc kubenswrapper[4599]: I1012 08:28:13.151890 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-frr-files/0.log" Oct 12 08:28:13 crc kubenswrapper[4599]: I1012 08:28:13.154668 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-reloader/0.log" Oct 12 08:28:13 crc kubenswrapper[4599]: I1012 08:28:13.207787 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-metrics/0.log" Oct 12 08:28:13 crc kubenswrapper[4599]: I1012 08:28:13.207822 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-metrics/0.log" Oct 12 08:28:13 crc kubenswrapper[4599]: I1012 08:28:13.356409 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-frr-files/0.log" Oct 12 08:28:13 crc kubenswrapper[4599]: I1012 08:28:13.361825 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-metrics/0.log" Oct 12 08:28:13 crc kubenswrapper[4599]: I1012 08:28:13.364043 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/controller/0.log" Oct 12 08:28:13 crc kubenswrapper[4599]: I1012 08:28:13.374871 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/cp-reloader/0.log" Oct 12 08:28:13 crc kubenswrapper[4599]: I1012 08:28:13.517778 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/frr-metrics/0.log" Oct 12 08:28:13 crc kubenswrapper[4599]: I1012 08:28:13.537821 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/kube-rbac-proxy/0.log" Oct 12 08:28:13 crc kubenswrapper[4599]: I1012 08:28:13.543727 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/kube-rbac-proxy-frr/0.log" Oct 12 08:28:13 crc kubenswrapper[4599]: I1012 08:28:13.678721 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/reloader/0.log" Oct 12 08:28:13 crc kubenswrapper[4599]: I1012 08:28:13.794130 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-zcx46_6f5b7478-9c00-4302-b356-cae717338202/frr-k8s-webhook-server/0.log" Oct 12 08:28:13 crc kubenswrapper[4599]: I1012 08:28:13.951769 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-75d566c47b-2dhk8_14e65bd0-37ae-438c-9d25-b2d4b70556e7/manager/0.log" Oct 12 08:28:14 crc kubenswrapper[4599]: I1012 08:28:14.092750 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5c94cffdb4-lj8nf_bb4359a2-e2a1-4e37-b7df-420ab49781c6/webhook-server/0.log" Oct 12 08:28:14 crc kubenswrapper[4599]: I1012 08:28:14.175785 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-g4nt4_12d002b3-67b4-4405-ab2f-930346bfc610/kube-rbac-proxy/0.log" Oct 12 08:28:14 crc kubenswrapper[4599]: I1012 08:28:14.611160 4599 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-76vb2_e2411c1e-4e34-4ca5-aa25-0b317652dd35/frr/0.log" Oct 12 08:28:14 crc kubenswrapper[4599]: I1012 08:28:14.675514 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-g4nt4_12d002b3-67b4-4405-ab2f-930346bfc610/speaker/0.log" Oct 12 08:28:23 crc kubenswrapper[4599]: I1012 08:28:23.627889 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v_492ddcff-667c-4dda-b878-741413cb8aa1/util/0.log" Oct 12 08:28:23 crc kubenswrapper[4599]: I1012 08:28:23.760795 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v_492ddcff-667c-4dda-b878-741413cb8aa1/pull/0.log" Oct 12 08:28:23 crc kubenswrapper[4599]: I1012 08:28:23.764525 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v_492ddcff-667c-4dda-b878-741413cb8aa1/util/0.log" Oct 12 08:28:23 crc kubenswrapper[4599]: I1012 08:28:23.811929 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v_492ddcff-667c-4dda-b878-741413cb8aa1/pull/0.log" Oct 12 08:28:23 crc kubenswrapper[4599]: I1012 08:28:23.904571 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v_492ddcff-667c-4dda-b878-741413cb8aa1/util/0.log" Oct 12 08:28:23 crc kubenswrapper[4599]: I1012 08:28:23.904787 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v_492ddcff-667c-4dda-b878-741413cb8aa1/pull/0.log" Oct 12 08:28:23 crc kubenswrapper[4599]: I1012 08:28:23.938556 4599 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2xcf9v_492ddcff-667c-4dda-b878-741413cb8aa1/extract/0.log" Oct 12 08:28:24 crc kubenswrapper[4599]: I1012 08:28:24.053134 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sn9x6_c3548e29-bf51-474a-9110-60bfad743fd3/extract-utilities/0.log" Oct 12 08:28:24 crc kubenswrapper[4599]: I1012 08:28:24.157778 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sn9x6_c3548e29-bf51-474a-9110-60bfad743fd3/extract-utilities/0.log" Oct 12 08:28:24 crc kubenswrapper[4599]: I1012 08:28:24.182297 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sn9x6_c3548e29-bf51-474a-9110-60bfad743fd3/extract-content/0.log" Oct 12 08:28:24 crc kubenswrapper[4599]: I1012 08:28:24.191629 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sn9x6_c3548e29-bf51-474a-9110-60bfad743fd3/extract-content/0.log" Oct 12 08:28:24 crc kubenswrapper[4599]: I1012 08:28:24.310307 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sn9x6_c3548e29-bf51-474a-9110-60bfad743fd3/extract-content/0.log" Oct 12 08:28:24 crc kubenswrapper[4599]: I1012 08:28:24.313037 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sn9x6_c3548e29-bf51-474a-9110-60bfad743fd3/extract-utilities/0.log" Oct 12 08:28:24 crc kubenswrapper[4599]: I1012 08:28:24.501394 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vz257_45933fcb-2999-4505-8356-fb50f8d1e2c7/extract-utilities/0.log" Oct 12 08:28:24 crc kubenswrapper[4599]: I1012 08:28:24.656496 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-vz257_45933fcb-2999-4505-8356-fb50f8d1e2c7/extract-utilities/0.log" Oct 12 08:28:24 crc kubenswrapper[4599]: I1012 08:28:24.704973 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sn9x6_c3548e29-bf51-474a-9110-60bfad743fd3/registry-server/0.log" Oct 12 08:28:24 crc kubenswrapper[4599]: I1012 08:28:24.716311 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vz257_45933fcb-2999-4505-8356-fb50f8d1e2c7/extract-content/0.log" Oct 12 08:28:24 crc kubenswrapper[4599]: I1012 08:28:24.753086 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vz257_45933fcb-2999-4505-8356-fb50f8d1e2c7/extract-content/0.log" Oct 12 08:28:24 crc kubenswrapper[4599]: I1012 08:28:24.865683 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vz257_45933fcb-2999-4505-8356-fb50f8d1e2c7/extract-utilities/0.log" Oct 12 08:28:24 crc kubenswrapper[4599]: I1012 08:28:24.891477 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vz257_45933fcb-2999-4505-8356-fb50f8d1e2c7/extract-content/0.log" Oct 12 08:28:25 crc kubenswrapper[4599]: I1012 08:28:25.049078 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr_82699d9d-fb0c-46be-9020-16bb7e4cb65c/util/0.log" Oct 12 08:28:25 crc kubenswrapper[4599]: I1012 08:28:25.294825 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr_82699d9d-fb0c-46be-9020-16bb7e4cb65c/util/0.log" Oct 12 08:28:25 crc kubenswrapper[4599]: I1012 08:28:25.310790 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr_82699d9d-fb0c-46be-9020-16bb7e4cb65c/pull/0.log" Oct 12 08:28:25 crc kubenswrapper[4599]: I1012 08:28:25.360191 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vz257_45933fcb-2999-4505-8356-fb50f8d1e2c7/registry-server/0.log" Oct 12 08:28:25 crc kubenswrapper[4599]: I1012 08:28:25.365657 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr_82699d9d-fb0c-46be-9020-16bb7e4cb65c/pull/0.log" Oct 12 08:28:25 crc kubenswrapper[4599]: I1012 08:28:25.458270 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr_82699d9d-fb0c-46be-9020-16bb7e4cb65c/util/0.log" Oct 12 08:28:25 crc kubenswrapper[4599]: I1012 08:28:25.483547 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr_82699d9d-fb0c-46be-9020-16bb7e4cb65c/pull/0.log" Oct 12 08:28:25 crc kubenswrapper[4599]: I1012 08:28:25.523598 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cg6fsr_82699d9d-fb0c-46be-9020-16bb7e4cb65c/extract/0.log" Oct 12 08:28:25 crc kubenswrapper[4599]: I1012 08:28:25.625847 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2vntc_04a727ef-7194-4df1-b0a2-0107085a972d/marketplace-operator/0.log" Oct 12 08:28:25 crc kubenswrapper[4599]: I1012 08:28:25.674810 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xkm4x_4b6aaa9e-4f52-45ec-994b-c0b4e600c00f/extract-utilities/0.log" Oct 12 08:28:25 crc kubenswrapper[4599]: I1012 08:28:25.817017 4599 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xkm4x_4b6aaa9e-4f52-45ec-994b-c0b4e600c00f/extract-content/0.log" Oct 12 08:28:25 crc kubenswrapper[4599]: I1012 08:28:25.823754 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xkm4x_4b6aaa9e-4f52-45ec-994b-c0b4e600c00f/extract-content/0.log" Oct 12 08:28:25 crc kubenswrapper[4599]: I1012 08:28:25.842682 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xkm4x_4b6aaa9e-4f52-45ec-994b-c0b4e600c00f/extract-utilities/0.log" Oct 12 08:28:25 crc kubenswrapper[4599]: I1012 08:28:25.987149 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xkm4x_4b6aaa9e-4f52-45ec-994b-c0b4e600c00f/extract-utilities/0.log" Oct 12 08:28:25 crc kubenswrapper[4599]: I1012 08:28:25.987289 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xkm4x_4b6aaa9e-4f52-45ec-994b-c0b4e600c00f/extract-content/0.log" Oct 12 08:28:26 crc kubenswrapper[4599]: I1012 08:28:26.129016 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xkm4x_4b6aaa9e-4f52-45ec-994b-c0b4e600c00f/registry-server/0.log" Oct 12 08:28:26 crc kubenswrapper[4599]: I1012 08:28:26.192284 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9d992_ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89/extract-utilities/0.log" Oct 12 08:28:26 crc kubenswrapper[4599]: I1012 08:28:26.291252 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9d992_ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89/extract-utilities/0.log" Oct 12 08:28:26 crc kubenswrapper[4599]: I1012 08:28:26.316757 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-9d992_ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89/extract-content/0.log" Oct 12 08:28:26 crc kubenswrapper[4599]: I1012 08:28:26.342162 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9d992_ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89/extract-content/0.log" Oct 12 08:28:26 crc kubenswrapper[4599]: I1012 08:28:26.454917 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9d992_ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89/extract-utilities/0.log" Oct 12 08:28:26 crc kubenswrapper[4599]: I1012 08:28:26.479049 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9d992_ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89/extract-content/0.log" Oct 12 08:28:26 crc kubenswrapper[4599]: I1012 08:28:26.833918 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9d992_ff5e25ae-0293-4ddc-8e3e-1c27d4aa9d89/registry-server/0.log" Oct 12 08:29:40 crc kubenswrapper[4599]: I1012 08:29:40.511433 4599 generic.go:334] "Generic (PLEG): container finished" podID="b0a8657a-c347-4913-afa8-020edbd6713a" containerID="9edae95dc665c176ae8ee0b31c5a4a8849bbb7a4b6a14ad7882e4b9dada95ddc" exitCode=0 Oct 12 08:29:40 crc kubenswrapper[4599]: I1012 08:29:40.511521 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nsbng/must-gather-kg65p" event={"ID":"b0a8657a-c347-4913-afa8-020edbd6713a","Type":"ContainerDied","Data":"9edae95dc665c176ae8ee0b31c5a4a8849bbb7a4b6a14ad7882e4b9dada95ddc"} Oct 12 08:29:40 crc kubenswrapper[4599]: I1012 08:29:40.512391 4599 scope.go:117] "RemoveContainer" containerID="9edae95dc665c176ae8ee0b31c5a4a8849bbb7a4b6a14ad7882e4b9dada95ddc" Oct 12 08:29:40 crc kubenswrapper[4599]: I1012 08:29:40.559887 4599 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-nsbng_must-gather-kg65p_b0a8657a-c347-4913-afa8-020edbd6713a/gather/0.log" Oct 12 08:29:49 crc kubenswrapper[4599]: I1012 08:29:49.046788 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nsbng/must-gather-kg65p"] Oct 12 08:29:49 crc kubenswrapper[4599]: I1012 08:29:49.047417 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-nsbng/must-gather-kg65p" podUID="b0a8657a-c347-4913-afa8-020edbd6713a" containerName="copy" containerID="cri-o://a81e6297c3114d099b16dabffb92bddbec48235ee56b4623a6d93e8a80ed563d" gracePeriod=2 Oct 12 08:29:49 crc kubenswrapper[4599]: I1012 08:29:49.055346 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nsbng/must-gather-kg65p"] Oct 12 08:29:49 crc kubenswrapper[4599]: I1012 08:29:49.391038 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nsbng_must-gather-kg65p_b0a8657a-c347-4913-afa8-020edbd6713a/copy/0.log" Oct 12 08:29:49 crc kubenswrapper[4599]: I1012 08:29:49.391706 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nsbng/must-gather-kg65p" Oct 12 08:29:49 crc kubenswrapper[4599]: I1012 08:29:49.576099 4599 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nsbng_must-gather-kg65p_b0a8657a-c347-4913-afa8-020edbd6713a/copy/0.log" Oct 12 08:29:49 crc kubenswrapper[4599]: I1012 08:29:49.576418 4599 generic.go:334] "Generic (PLEG): container finished" podID="b0a8657a-c347-4913-afa8-020edbd6713a" containerID="a81e6297c3114d099b16dabffb92bddbec48235ee56b4623a6d93e8a80ed563d" exitCode=143 Oct 12 08:29:49 crc kubenswrapper[4599]: I1012 08:29:49.576464 4599 scope.go:117] "RemoveContainer" containerID="a81e6297c3114d099b16dabffb92bddbec48235ee56b4623a6d93e8a80ed563d" Oct 12 08:29:49 crc kubenswrapper[4599]: I1012 08:29:49.576602 4599 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nsbng/must-gather-kg65p" Oct 12 08:29:49 crc kubenswrapper[4599]: I1012 08:29:49.592608 4599 scope.go:117] "RemoveContainer" containerID="9edae95dc665c176ae8ee0b31c5a4a8849bbb7a4b6a14ad7882e4b9dada95ddc" Oct 12 08:29:49 crc kubenswrapper[4599]: I1012 08:29:49.592953 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l562\" (UniqueName: \"kubernetes.io/projected/b0a8657a-c347-4913-afa8-020edbd6713a-kube-api-access-9l562\") pod \"b0a8657a-c347-4913-afa8-020edbd6713a\" (UID: \"b0a8657a-c347-4913-afa8-020edbd6713a\") " Oct 12 08:29:49 crc kubenswrapper[4599]: I1012 08:29:49.593029 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b0a8657a-c347-4913-afa8-020edbd6713a-must-gather-output\") pod \"b0a8657a-c347-4913-afa8-020edbd6713a\" (UID: \"b0a8657a-c347-4913-afa8-020edbd6713a\") " Oct 12 08:29:49 crc kubenswrapper[4599]: I1012 08:29:49.597828 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b0a8657a-c347-4913-afa8-020edbd6713a-kube-api-access-9l562" (OuterVolumeSpecName: "kube-api-access-9l562") pod "b0a8657a-c347-4913-afa8-020edbd6713a" (UID: "b0a8657a-c347-4913-afa8-020edbd6713a"). InnerVolumeSpecName "kube-api-access-9l562". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:29:49 crc kubenswrapper[4599]: I1012 08:29:49.647708 4599 scope.go:117] "RemoveContainer" containerID="a81e6297c3114d099b16dabffb92bddbec48235ee56b4623a6d93e8a80ed563d" Oct 12 08:29:49 crc kubenswrapper[4599]: E1012 08:29:49.648165 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a81e6297c3114d099b16dabffb92bddbec48235ee56b4623a6d93e8a80ed563d\": container with ID starting with a81e6297c3114d099b16dabffb92bddbec48235ee56b4623a6d93e8a80ed563d not found: ID does not exist" containerID="a81e6297c3114d099b16dabffb92bddbec48235ee56b4623a6d93e8a80ed563d" Oct 12 08:29:49 crc kubenswrapper[4599]: I1012 08:29:49.648208 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a81e6297c3114d099b16dabffb92bddbec48235ee56b4623a6d93e8a80ed563d"} err="failed to get container status \"a81e6297c3114d099b16dabffb92bddbec48235ee56b4623a6d93e8a80ed563d\": rpc error: code = NotFound desc = could not find container \"a81e6297c3114d099b16dabffb92bddbec48235ee56b4623a6d93e8a80ed563d\": container with ID starting with a81e6297c3114d099b16dabffb92bddbec48235ee56b4623a6d93e8a80ed563d not found: ID does not exist" Oct 12 08:29:49 crc kubenswrapper[4599]: I1012 08:29:49.648231 4599 scope.go:117] "RemoveContainer" containerID="9edae95dc665c176ae8ee0b31c5a4a8849bbb7a4b6a14ad7882e4b9dada95ddc" Oct 12 08:29:49 crc kubenswrapper[4599]: E1012 08:29:49.648667 4599 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9edae95dc665c176ae8ee0b31c5a4a8849bbb7a4b6a14ad7882e4b9dada95ddc\": 
container with ID starting with 9edae95dc665c176ae8ee0b31c5a4a8849bbb7a4b6a14ad7882e4b9dada95ddc not found: ID does not exist" containerID="9edae95dc665c176ae8ee0b31c5a4a8849bbb7a4b6a14ad7882e4b9dada95ddc" Oct 12 08:29:49 crc kubenswrapper[4599]: I1012 08:29:49.648708 4599 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9edae95dc665c176ae8ee0b31c5a4a8849bbb7a4b6a14ad7882e4b9dada95ddc"} err="failed to get container status \"9edae95dc665c176ae8ee0b31c5a4a8849bbb7a4b6a14ad7882e4b9dada95ddc\": rpc error: code = NotFound desc = could not find container \"9edae95dc665c176ae8ee0b31c5a4a8849bbb7a4b6a14ad7882e4b9dada95ddc\": container with ID starting with 9edae95dc665c176ae8ee0b31c5a4a8849bbb7a4b6a14ad7882e4b9dada95ddc not found: ID does not exist" Oct 12 08:29:49 crc kubenswrapper[4599]: I1012 08:29:49.695855 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l562\" (UniqueName: \"kubernetes.io/projected/b0a8657a-c347-4913-afa8-020edbd6713a-kube-api-access-9l562\") on node \"crc\" DevicePath \"\"" Oct 12 08:29:49 crc kubenswrapper[4599]: I1012 08:29:49.709213 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0a8657a-c347-4913-afa8-020edbd6713a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b0a8657a-c347-4913-afa8-020edbd6713a" (UID: "b0a8657a-c347-4913-afa8-020edbd6713a"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 08:29:49 crc kubenswrapper[4599]: I1012 08:29:49.797060 4599 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b0a8657a-c347-4913-afa8-020edbd6713a-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 12 08:29:51 crc kubenswrapper[4599]: I1012 08:29:51.552836 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0a8657a-c347-4913-afa8-020edbd6713a" path="/var/lib/kubelet/pods/b0a8657a-c347-4913-afa8-020edbd6713a/volumes" Oct 12 08:29:58 crc kubenswrapper[4599]: I1012 08:29:58.321853 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 08:29:58 crc kubenswrapper[4599]: I1012 08:29:58.322199 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.148804 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337630-gsc28"] Oct 12 08:30:00 crc kubenswrapper[4599]: E1012 08:30:00.149279 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e349020-7268-4ca0-b16f-d4122ffd70ff" containerName="extract-content" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.149291 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e349020-7268-4ca0-b16f-d4122ffd70ff" containerName="extract-content" Oct 12 08:30:00 crc kubenswrapper[4599]: E1012 08:30:00.149303 4599 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e349020-7268-4ca0-b16f-d4122ffd70ff" containerName="extract-utilities" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.149310 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e349020-7268-4ca0-b16f-d4122ffd70ff" containerName="extract-utilities" Oct 12 08:30:00 crc kubenswrapper[4599]: E1012 08:30:00.149319 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e364767d-95fe-438e-8e4f-c73bd3b92a04" containerName="extract-utilities" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.149324 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="e364767d-95fe-438e-8e4f-c73bd3b92a04" containerName="extract-utilities" Oct 12 08:30:00 crc kubenswrapper[4599]: E1012 08:30:00.149350 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40603c3c-1d14-4973-aac7-c28ba1aa08a3" containerName="extract-utilities" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.149356 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="40603c3c-1d14-4973-aac7-c28ba1aa08a3" containerName="extract-utilities" Oct 12 08:30:00 crc kubenswrapper[4599]: E1012 08:30:00.149364 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e364767d-95fe-438e-8e4f-c73bd3b92a04" containerName="extract-content" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.149369 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="e364767d-95fe-438e-8e4f-c73bd3b92a04" containerName="extract-content" Oct 12 08:30:00 crc kubenswrapper[4599]: E1012 08:30:00.149379 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a8657a-c347-4913-afa8-020edbd6713a" containerName="gather" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.149385 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a8657a-c347-4913-afa8-020edbd6713a" containerName="gather" Oct 12 08:30:00 crc kubenswrapper[4599]: E1012 08:30:00.149406 4599 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="40603c3c-1d14-4973-aac7-c28ba1aa08a3" containerName="extract-content" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.149411 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="40603c3c-1d14-4973-aac7-c28ba1aa08a3" containerName="extract-content" Oct 12 08:30:00 crc kubenswrapper[4599]: E1012 08:30:00.149421 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40603c3c-1d14-4973-aac7-c28ba1aa08a3" containerName="registry-server" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.149426 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="40603c3c-1d14-4973-aac7-c28ba1aa08a3" containerName="registry-server" Oct 12 08:30:00 crc kubenswrapper[4599]: E1012 08:30:00.149438 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e364767d-95fe-438e-8e4f-c73bd3b92a04" containerName="registry-server" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.149443 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="e364767d-95fe-438e-8e4f-c73bd3b92a04" containerName="registry-server" Oct 12 08:30:00 crc kubenswrapper[4599]: E1012 08:30:00.149454 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a8657a-c347-4913-afa8-020edbd6713a" containerName="copy" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.149459 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a8657a-c347-4913-afa8-020edbd6713a" containerName="copy" Oct 12 08:30:00 crc kubenswrapper[4599]: E1012 08:30:00.149468 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e349020-7268-4ca0-b16f-d4122ffd70ff" containerName="registry-server" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.149473 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e349020-7268-4ca0-b16f-d4122ffd70ff" containerName="registry-server" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.149650 4599 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b0a8657a-c347-4913-afa8-020edbd6713a" containerName="copy" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.149660 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a8657a-c347-4913-afa8-020edbd6713a" containerName="gather" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.149676 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e349020-7268-4ca0-b16f-d4122ffd70ff" containerName="registry-server" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.149688 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="40603c3c-1d14-4973-aac7-c28ba1aa08a3" containerName="registry-server" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.149696 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="e364767d-95fe-438e-8e4f-c73bd3b92a04" containerName="registry-server" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.150187 4599 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337630-gsc28" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.152050 4599 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.152050 4599 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.156983 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337630-gsc28"] Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.351935 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq9fx\" (UniqueName: \"kubernetes.io/projected/e32f3ce9-c26d-4de7-b650-bb16b0c60a1d-kube-api-access-vq9fx\") pod 
\"collect-profiles-29337630-gsc28\" (UID: \"e32f3ce9-c26d-4de7-b650-bb16b0c60a1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337630-gsc28" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.352048 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e32f3ce9-c26d-4de7-b650-bb16b0c60a1d-secret-volume\") pod \"collect-profiles-29337630-gsc28\" (UID: \"e32f3ce9-c26d-4de7-b650-bb16b0c60a1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337630-gsc28" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.352500 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e32f3ce9-c26d-4de7-b650-bb16b0c60a1d-config-volume\") pod \"collect-profiles-29337630-gsc28\" (UID: \"e32f3ce9-c26d-4de7-b650-bb16b0c60a1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337630-gsc28" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.453968 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e32f3ce9-c26d-4de7-b650-bb16b0c60a1d-config-volume\") pod \"collect-profiles-29337630-gsc28\" (UID: \"e32f3ce9-c26d-4de7-b650-bb16b0c60a1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337630-gsc28" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.454076 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq9fx\" (UniqueName: \"kubernetes.io/projected/e32f3ce9-c26d-4de7-b650-bb16b0c60a1d-kube-api-access-vq9fx\") pod \"collect-profiles-29337630-gsc28\" (UID: \"e32f3ce9-c26d-4de7-b650-bb16b0c60a1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337630-gsc28" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.454179 4599 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e32f3ce9-c26d-4de7-b650-bb16b0c60a1d-secret-volume\") pod \"collect-profiles-29337630-gsc28\" (UID: \"e32f3ce9-c26d-4de7-b650-bb16b0c60a1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337630-gsc28" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.454826 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e32f3ce9-c26d-4de7-b650-bb16b0c60a1d-config-volume\") pod \"collect-profiles-29337630-gsc28\" (UID: \"e32f3ce9-c26d-4de7-b650-bb16b0c60a1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337630-gsc28" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.458975 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e32f3ce9-c26d-4de7-b650-bb16b0c60a1d-secret-volume\") pod \"collect-profiles-29337630-gsc28\" (UID: \"e32f3ce9-c26d-4de7-b650-bb16b0c60a1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337630-gsc28" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.467669 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq9fx\" (UniqueName: \"kubernetes.io/projected/e32f3ce9-c26d-4de7-b650-bb16b0c60a1d-kube-api-access-vq9fx\") pod \"collect-profiles-29337630-gsc28\" (UID: \"e32f3ce9-c26d-4de7-b650-bb16b0c60a1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337630-gsc28" Oct 12 08:30:00 crc kubenswrapper[4599]: I1012 08:30:00.764029 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337630-gsc28" Oct 12 08:30:01 crc kubenswrapper[4599]: I1012 08:30:01.121451 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337630-gsc28"] Oct 12 08:30:01 crc kubenswrapper[4599]: I1012 08:30:01.656065 4599 generic.go:334] "Generic (PLEG): container finished" podID="e32f3ce9-c26d-4de7-b650-bb16b0c60a1d" containerID="c0b04e92e71076d4ef91d1575bd9d8f43ad6c4b3f1e6cb75d8cef9083d6fea63" exitCode=0 Oct 12 08:30:01 crc kubenswrapper[4599]: I1012 08:30:01.656144 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337630-gsc28" event={"ID":"e32f3ce9-c26d-4de7-b650-bb16b0c60a1d","Type":"ContainerDied","Data":"c0b04e92e71076d4ef91d1575bd9d8f43ad6c4b3f1e6cb75d8cef9083d6fea63"} Oct 12 08:30:01 crc kubenswrapper[4599]: I1012 08:30:01.656827 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337630-gsc28" event={"ID":"e32f3ce9-c26d-4de7-b650-bb16b0c60a1d","Type":"ContainerStarted","Data":"4085eb0ce8749fb4b0e3b48b7fb22998c4e51cace3024fc120cd3e7b29871bcd"} Oct 12 08:30:02 crc kubenswrapper[4599]: I1012 08:30:02.894468 4599 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337630-gsc28" Oct 12 08:30:03 crc kubenswrapper[4599]: I1012 08:30:03.090744 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e32f3ce9-c26d-4de7-b650-bb16b0c60a1d-secret-volume\") pod \"e32f3ce9-c26d-4de7-b650-bb16b0c60a1d\" (UID: \"e32f3ce9-c26d-4de7-b650-bb16b0c60a1d\") " Oct 12 08:30:03 crc kubenswrapper[4599]: I1012 08:30:03.090870 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e32f3ce9-c26d-4de7-b650-bb16b0c60a1d-config-volume\") pod \"e32f3ce9-c26d-4de7-b650-bb16b0c60a1d\" (UID: \"e32f3ce9-c26d-4de7-b650-bb16b0c60a1d\") " Oct 12 08:30:03 crc kubenswrapper[4599]: I1012 08:30:03.090906 4599 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq9fx\" (UniqueName: \"kubernetes.io/projected/e32f3ce9-c26d-4de7-b650-bb16b0c60a1d-kube-api-access-vq9fx\") pod \"e32f3ce9-c26d-4de7-b650-bb16b0c60a1d\" (UID: \"e32f3ce9-c26d-4de7-b650-bb16b0c60a1d\") " Oct 12 08:30:03 crc kubenswrapper[4599]: I1012 08:30:03.091666 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e32f3ce9-c26d-4de7-b650-bb16b0c60a1d-config-volume" (OuterVolumeSpecName: "config-volume") pod "e32f3ce9-c26d-4de7-b650-bb16b0c60a1d" (UID: "e32f3ce9-c26d-4de7-b650-bb16b0c60a1d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 08:30:03 crc kubenswrapper[4599]: I1012 08:30:03.095521 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e32f3ce9-c26d-4de7-b650-bb16b0c60a1d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e32f3ce9-c26d-4de7-b650-bb16b0c60a1d" (UID: "e32f3ce9-c26d-4de7-b650-bb16b0c60a1d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 08:30:03 crc kubenswrapper[4599]: I1012 08:30:03.095937 4599 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e32f3ce9-c26d-4de7-b650-bb16b0c60a1d-kube-api-access-vq9fx" (OuterVolumeSpecName: "kube-api-access-vq9fx") pod "e32f3ce9-c26d-4de7-b650-bb16b0c60a1d" (UID: "e32f3ce9-c26d-4de7-b650-bb16b0c60a1d"). InnerVolumeSpecName "kube-api-access-vq9fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 08:30:03 crc kubenswrapper[4599]: I1012 08:30:03.192713 4599 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e32f3ce9-c26d-4de7-b650-bb16b0c60a1d-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 12 08:30:03 crc kubenswrapper[4599]: I1012 08:30:03.192742 4599 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e32f3ce9-c26d-4de7-b650-bb16b0c60a1d-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 08:30:03 crc kubenswrapper[4599]: I1012 08:30:03.192752 4599 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq9fx\" (UniqueName: \"kubernetes.io/projected/e32f3ce9-c26d-4de7-b650-bb16b0c60a1d-kube-api-access-vq9fx\") on node \"crc\" DevicePath \"\"" Oct 12 08:30:03 crc kubenswrapper[4599]: I1012 08:30:03.670253 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337630-gsc28" event={"ID":"e32f3ce9-c26d-4de7-b650-bb16b0c60a1d","Type":"ContainerDied","Data":"4085eb0ce8749fb4b0e3b48b7fb22998c4e51cace3024fc120cd3e7b29871bcd"} Oct 12 08:30:03 crc kubenswrapper[4599]: I1012 08:30:03.670441 4599 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4085eb0ce8749fb4b0e3b48b7fb22998c4e51cace3024fc120cd3e7b29871bcd" Oct 12 08:30:03 crc kubenswrapper[4599]: I1012 08:30:03.670288 4599 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337630-gsc28" Oct 12 08:30:03 crc kubenswrapper[4599]: I1012 08:30:03.947538 4599 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn"] Oct 12 08:30:03 crc kubenswrapper[4599]: I1012 08:30:03.954051 4599 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337585-gh2qn"] Oct 12 08:30:05 crc kubenswrapper[4599]: I1012 08:30:05.553142 4599 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6dd7df5-77c0-4ffb-9d14-425465bb9ab3" path="/var/lib/kubelet/pods/b6dd7df5-77c0-4ffb-9d14-425465bb9ab3/volumes" Oct 12 08:30:08 crc kubenswrapper[4599]: I1012 08:30:08.048181 4599 scope.go:117] "RemoveContainer" containerID="4ad77157cd9ccdd7484b815bcc542a463380fb30bb7bec230944a05c2d9c6007" Oct 12 08:30:09 crc kubenswrapper[4599]: E1012 08:30:09.812413 4599 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32f3ce9_c26d_4de7_b650_bb16b0c60a1d.slice/crio-4085eb0ce8749fb4b0e3b48b7fb22998c4e51cace3024fc120cd3e7b29871bcd\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32f3ce9_c26d_4de7_b650_bb16b0c60a1d.slice\": RecentStats: unable to find data in memory cache]" Oct 12 08:30:20 crc kubenswrapper[4599]: E1012 08:30:20.014139 4599 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32f3ce9_c26d_4de7_b650_bb16b0c60a1d.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32f3ce9_c26d_4de7_b650_bb16b0c60a1d.slice/crio-4085eb0ce8749fb4b0e3b48b7fb22998c4e51cace3024fc120cd3e7b29871bcd\": RecentStats: unable to find data in memory cache]" Oct 12 08:30:28 crc kubenswrapper[4599]: I1012 08:30:28.322220 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 08:30:28 crc kubenswrapper[4599]: I1012 08:30:28.322585 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 08:30:30 crc kubenswrapper[4599]: E1012 08:30:30.200558 4599 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32f3ce9_c26d_4de7_b650_bb16b0c60a1d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32f3ce9_c26d_4de7_b650_bb16b0c60a1d.slice/crio-4085eb0ce8749fb4b0e3b48b7fb22998c4e51cace3024fc120cd3e7b29871bcd\": RecentStats: unable to find data in memory cache]" Oct 12 08:30:40 crc kubenswrapper[4599]: E1012 08:30:40.413134 4599 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32f3ce9_c26d_4de7_b650_bb16b0c60a1d.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32f3ce9_c26d_4de7_b650_bb16b0c60a1d.slice/crio-4085eb0ce8749fb4b0e3b48b7fb22998c4e51cace3024fc120cd3e7b29871bcd\": RecentStats: unable to find data in memory cache]" Oct 12 08:30:50 crc kubenswrapper[4599]: E1012 08:30:50.616964 4599 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32f3ce9_c26d_4de7_b650_bb16b0c60a1d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32f3ce9_c26d_4de7_b650_bb16b0c60a1d.slice/crio-4085eb0ce8749fb4b0e3b48b7fb22998c4e51cace3024fc120cd3e7b29871bcd\": RecentStats: unable to find data in memory cache]" Oct 12 08:30:58 crc kubenswrapper[4599]: I1012 08:30:58.321951 4599 patch_prober.go:28] interesting pod/machine-config-daemon-5mz5c container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 08:30:58 crc kubenswrapper[4599]: I1012 08:30:58.322477 4599 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 08:30:58 crc kubenswrapper[4599]: I1012 08:30:58.322519 4599 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" Oct 12 08:30:58 crc kubenswrapper[4599]: I1012 08:30:58.322973 4599 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"261ce39d1df16a70aab27383f113d909d054165675c4dc256efe676be8b231e4"} pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 08:30:58 crc kubenswrapper[4599]: I1012 08:30:58.323015 4599 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" podUID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerName="machine-config-daemon" containerID="cri-o://261ce39d1df16a70aab27383f113d909d054165675c4dc256efe676be8b231e4" gracePeriod=600 Oct 12 08:30:59 crc kubenswrapper[4599]: I1012 08:30:59.045529 4599 generic.go:334] "Generic (PLEG): container finished" podID="cc694bce-8c25-4729-b452-29d44d3efe6e" containerID="261ce39d1df16a70aab27383f113d909d054165675c4dc256efe676be8b231e4" exitCode=0 Oct 12 08:30:59 crc kubenswrapper[4599]: I1012 08:30:59.046024 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerDied","Data":"261ce39d1df16a70aab27383f113d909d054165675c4dc256efe676be8b231e4"} Oct 12 08:30:59 crc kubenswrapper[4599]: I1012 08:30:59.046172 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5mz5c" event={"ID":"cc694bce-8c25-4729-b452-29d44d3efe6e","Type":"ContainerStarted","Data":"2b1789dcfc64b742f9e4086b2b09a06e8edefba60b860169812e48b74134d647"} Oct 12 08:30:59 crc kubenswrapper[4599]: I1012 08:30:59.046245 4599 scope.go:117] "RemoveContainer" containerID="536fe0523d7f4d373d85a237a3866f6e04c0049c07c9b6780974dfce18c2815f" Oct 12 08:31:00 crc kubenswrapper[4599]: E1012 08:31:00.845056 4599 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32f3ce9_c26d_4de7_b650_bb16b0c60a1d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32f3ce9_c26d_4de7_b650_bb16b0c60a1d.slice/crio-4085eb0ce8749fb4b0e3b48b7fb22998c4e51cace3024fc120cd3e7b29871bcd\": RecentStats: unable to find data in memory cache]" Oct 12 08:31:32 crc kubenswrapper[4599]: I1012 08:31:32.047390 4599 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wx674"] Oct 12 08:31:32 crc kubenswrapper[4599]: E1012 08:31:32.049929 4599 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e32f3ce9-c26d-4de7-b650-bb16b0c60a1d" containerName="collect-profiles" Oct 12 08:31:32 crc kubenswrapper[4599]: I1012 08:31:32.050001 4599 state_mem.go:107] "Deleted CPUSet assignment" podUID="e32f3ce9-c26d-4de7-b650-bb16b0c60a1d" containerName="collect-profiles" Oct 12 08:31:32 crc kubenswrapper[4599]: I1012 08:31:32.050543 4599 memory_manager.go:354] "RemoveStaleState removing state" podUID="e32f3ce9-c26d-4de7-b650-bb16b0c60a1d" containerName="collect-profiles" Oct 12 08:31:32 crc kubenswrapper[4599]: I1012 08:31:32.052052 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wx674" Oct 12 08:31:32 crc kubenswrapper[4599]: I1012 08:31:32.056258 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wx674"] Oct 12 08:31:32 crc kubenswrapper[4599]: I1012 08:31:32.208771 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3-catalog-content\") pod \"community-operators-wx674\" (UID: \"7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3\") " pod="openshift-marketplace/community-operators-wx674" Oct 12 08:31:32 crc kubenswrapper[4599]: I1012 08:31:32.208844 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3-utilities\") pod \"community-operators-wx674\" (UID: \"7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3\") " pod="openshift-marketplace/community-operators-wx674" Oct 12 08:31:32 crc kubenswrapper[4599]: I1012 08:31:32.209397 4599 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdjl6\" (UniqueName: \"kubernetes.io/projected/7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3-kube-api-access-hdjl6\") pod \"community-operators-wx674\" (UID: \"7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3\") " pod="openshift-marketplace/community-operators-wx674" Oct 12 08:31:32 crc kubenswrapper[4599]: I1012 08:31:32.312879 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdjl6\" (UniqueName: \"kubernetes.io/projected/7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3-kube-api-access-hdjl6\") pod \"community-operators-wx674\" (UID: \"7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3\") " pod="openshift-marketplace/community-operators-wx674" Oct 12 08:31:32 crc kubenswrapper[4599]: I1012 08:31:32.312978 4599 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3-catalog-content\") pod \"community-operators-wx674\" (UID: \"7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3\") " pod="openshift-marketplace/community-operators-wx674" Oct 12 08:31:32 crc kubenswrapper[4599]: I1012 08:31:32.313097 4599 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3-utilities\") pod \"community-operators-wx674\" (UID: \"7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3\") " pod="openshift-marketplace/community-operators-wx674" Oct 12 08:31:32 crc kubenswrapper[4599]: I1012 08:31:32.313535 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3-catalog-content\") pod \"community-operators-wx674\" (UID: \"7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3\") " pod="openshift-marketplace/community-operators-wx674" Oct 12 08:31:32 crc kubenswrapper[4599]: I1012 08:31:32.313673 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3-utilities\") pod \"community-operators-wx674\" (UID: \"7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3\") " pod="openshift-marketplace/community-operators-wx674" Oct 12 08:31:32 crc kubenswrapper[4599]: I1012 08:31:32.331269 4599 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdjl6\" (UniqueName: \"kubernetes.io/projected/7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3-kube-api-access-hdjl6\") pod \"community-operators-wx674\" (UID: \"7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3\") " pod="openshift-marketplace/community-operators-wx674" Oct 12 08:31:32 crc kubenswrapper[4599]: I1012 08:31:32.370406 4599 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wx674" Oct 12 08:31:32 crc kubenswrapper[4599]: I1012 08:31:32.791184 4599 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wx674"] Oct 12 08:31:33 crc kubenswrapper[4599]: I1012 08:31:33.284185 4599 generic.go:334] "Generic (PLEG): container finished" podID="7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3" containerID="27fbbaa00b0dbc2a58b96356ef79d65c5ebd482aa14fdd645ff74a37d46b0feb" exitCode=0 Oct 12 08:31:33 crc kubenswrapper[4599]: I1012 08:31:33.284378 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wx674" event={"ID":"7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3","Type":"ContainerDied","Data":"27fbbaa00b0dbc2a58b96356ef79d65c5ebd482aa14fdd645ff74a37d46b0feb"} Oct 12 08:31:33 crc kubenswrapper[4599]: I1012 08:31:33.284569 4599 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wx674" event={"ID":"7a3e7fc0-43e6-4a90-ab35-fd2fdb6e26e3","Type":"ContainerStarted","Data":"9b2033370b38080b3177aa268d54f4e8c2b332f1671c5303eacabf61501532fd"}